Program Configuration
A program is a way to group tests that share reporting features and visibility settings. This allows you to compare by content area and grade, and to track student performance over time. A test must be assigned to a program to be available in the Performance Report. Programs are designed to work for a single academic year, not multiple years. Typically, programs are not split out by grade, test window, or subject, as those filters are already available in reports; an exception would be if tests have different reporting requirements. Think of programs as test categories or "buckets." Examples of program names include Spring Summative, Diagnostics, Checkpoints, Interims, and Benchmarks.
In most cases, your Pearson program delivery team will configure programs after working with site administrators to determine reporting requirements. Access is controlled by the user role configuration 'Edit Program Objectives.'
Programs are required for dynamic reporting in ADAM. Customers who only extract data and do no reporting within ADAM are not required to link tests to programs, but they typically do.
See also: Individual Student Report (ISR) Setup, Considerations for Program Configuration, Reportable Item Group
View Programs
Go to Test Management > Program Configuration and do one of the following:
- Click Create New to set up a new program (details below).
- Click the Edit icon in the Actions column to modify an existing program.
- Click the Clone icon in the Actions column to duplicate an existing program and modify it as needed.
- In the Archived column, toggle on a program to archive it. The interface loads only non-archived programs by default, but you can select the 'Show Archived' checkbox at the top of the dashboard to show all programs, if needed. Archived programs are not listed in the Program drop-down list in Administrations > Filter, Reporting > Progress, Reporting > Performance, Reporting > Results Explorer, or Operations > Session Explorer.
Create a New Program
Important: After assigning a test to a program, the program's score type selections, windows, grade levels, and content areas cannot be changed. Make sure to carefully plan and configure the program before assigning tests. See also: Considerations for Program Configuration.
- Go to Test Management > Program Configuration.
- Click Create New.
- Refer to the expandable sections below to configure the new program.
- Click Save periodically.
After creating a program, you can:
- Assign tests to the program when creating a test from Test Management > Tests.
- Override some program settings at the test level. For Performance Levels, cut scores and labels can differ by test, but you cannot change which levels count as proficient, change the colors, or add or remove levels.
Complete initial program setup on the Settings tab. Define when the program is used, who uses it, and what it is used for.
Note: You must define a primary performance objective before you can save data in Settings.
General
- Program Name: Required. This name will be used when selecting a program in the performance reports, so it should be easy to recognize.
- Program Code: A unique code generated automatically based on the program name (if no code is entered initially). Used internally.
- ADAM Scored: Select for tests delivered through the ADAM player. Leave this checkbox unchecked for tests delivered externally (such as through TestNav). A program cannot contain both ADAM and external tests.
- Active: Select to make the program available when assigning a test to a program and in reporting. You can leave this unchecked until you are ready to use the program.
- Available Testing Windows: Required. Enter one or more window names (such as Fall, Spring). These are simply labels to facilitate reporting over time and do not control when students can test. You can order the window sequence with the dragger bars so that reports show tests from oldest to newest. The order cannot be changed once testing begins.
- Available Grades: Required. Select one or more grade levels to use in the program. Grade bands are available at the bottom of the list.
- Available Content Areas: Required. Select one or more content areas used in the program, such as Algebra or ELA (these are defined in System > Client Settings). Content areas enable you to have multiple assessments for the same subject and area (e.g., Algebra II and Geometry).
- School Year: Select the school year for the program. The year menu options are hardcoded, and the year refers to the start of the school year (e.g., 24 represents 2024-25).
- Available Accountability Codes: Determines, per program, which accountability codes are available to assign to a session. With this feature, you can display a different set of accountability codes for each program; for example, one set of codes for a Summative program and another set for an Interim program.
Report Visibility
- Program Report: Program Reports compare results from different tests in the same program and support reporting by demographics.
- Student Performance Report: See also: Student Performance Report
- Class Performance Report: See also: Class Performance Report
- Teacher Performance Report: See also: Teacher Performance Report
- Class Reports: Select to allow users to access class-level reports in My Classes and Reporting for tests in the program.
- Student Results: Show a combined view of all student results.
Report Settings
- Item Analysis Reporting: If selected, item analysis reporting will be provided for tests using this program.
- Show item content, correct answer and student responses
- Show correct answer: Disabled if 'Show item content, correct answer and student responses' is selected.
- Show individual responses: If selected, individual response states are viewable in the Student Performance Report. 'Show item content' must also be selected.
- Show trait level reporting: Available for manually scored items.
- Standards Reporting: Standard sub-scores will be calculated, and standard reporting will be available to teachers and administrators. For ADAM-delivered tests only.
- Hide aligned items details: If checked, aligned item details will not be visible on the Standard Performance report.
- Allow Student Performance Test Comparison: The comparison widget was previously always available on the report, and showing it remains the default. Unchecking this control removes the compare widget from the Student Performance Report.
- Preliminary Reporting: This setting allows preliminary results (sessions that are SCORE PENDING, along with SCORE COMPLETE) to appear on reports. Users view either preliminary reports or final reports; final reports use only SCORE COMPLETE sessions (see the sketch after this list). When the setting is enabled, a 'Preliminary Report' badge is displayed with a popover noting that scores are not final. Preliminary reporting does not include item analysis, standard performance, or Individual Student Reports (ISRs).
- Review Only: If checked, only Program Reviewers will see this program.
- Review Only (Ops): If checked, only Program Reviewers will see this program in Operations filters.
- Only Report Battery Results: Only battery test results are shown; individual test results will not be accessible in the ADAM user interface.
- Show Student Their Results: Allows students to see their results in ADAM. For ADAM-delivered tests only.
- Show Student Item View: Allows students to see the test items in ADAM. For ADAM-delivered tests only.
- Performance Comparison Parent Org Type: Select a level from the menu. Specifies the type of the highest non-scoped parent org that is always visible in the performance report. If empty, only the user's scoped orgs are visible.
- Reportable Item Group Report Label: Change default name if desired.
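To make the preliminary vs. final distinction referenced above concrete, the following minimal sketch shows which sessions feed each view. It assumes only the two session statuses named above; the field and function names are illustrative, not part of ADAM.

```python
# Illustrative only: preliminary reports include SCORE PENDING sessions;
# final reports use SCORE COMPLETE sessions exclusively.
def sessions_for_report(sessions, preliminary):
    allowed = {"SCORE COMPLETE", "SCORE PENDING"} if preliminary else {"SCORE COMPLETE"}
    return [s for s in sessions if s["status"] in allowed]

sessions = [{"id": 1, "status": "SCORE COMPLETE"},
            {"id": 2, "status": "SCORE PENDING"}]
print(len(sessions_for_report(sessions, preliminary=True)))   # 2 -- preliminary view
print(len(sessions_for_report(sessions, preliminary=False)))  # 1 -- final view
```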
In the Objectives tab, define the performance objectives that you want to appear in reports, such as raw score. The available objectives vary based on whether the ADAM Scored checkbox is selected in General settings. The Raw Score and Percent Correct objectives appear by default.
Note: The objective configuration is a template that can be changed at the test level. For example, if you define cut scores here, they can be changed on individual tests when assigning them to the program.
- Raw Score (RAW_SCORE): Always available. Reports show raw score if selected as a primary or secondary objective.
- Percent Correct (PERCENT_CORRECT): Always available. Reports show percent correct if selected as a primary or secondary objective.
To add additional objectives, select from the Add Objective dropdown. You can rename objectives in the Objective Name field.
Performance Level (PERF_LEVEL):
- Usually set as the primary objective when used.
- Input Variable: Select RAW_SCORE or PERCENT_CORRECT (or another input based on an objective you have created). This value will be used to determine the output score.
- Performance Level Settings: Enter the Label, Text description (such as "Above expectations"), and Cut Score.
- Check the Proficient box when applicable for the performance level. Levels tagged as Proficient are used to calculate the Proficiency value in the Performance Report (see the sketch after this list).
- Optionally, select a new color for a level by clicking its existing color. Select Add to create more levels.
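To illustrate how performance level cut scores work, here is a minimal sketch that assigns a raw score to a level. It assumes each cut score is the minimum input score for its level; the labels, cut scores, and code are hypothetical, not ADAM defaults.

```python
# Hypothetical levels keyed by cut score (the minimum input score for the level).
# Labels, cut scores, and the RAW_SCORE input are illustrative, not ADAM defaults.
LEVELS = [
    (0,  "Does Not Meet", False),
    (20, "Approaching",   False),
    (35, "Meets",         True),   # tagged Proficient
    (50, "Exceeds",       True),   # tagged Proficient
]

def performance_level(raw_score):
    # A student lands in the highest level whose cut score they meet or exceed.
    label, proficient = None, False
    for cut, name, prof in LEVELS:
        if raw_score >= cut:
            label, proficient = name, prof
    return label, proficient

print(performance_level(42))  # ('Meets', True) -- counts toward Proficiency
```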
Lookup (LOOKUP):
- Enter a database key in the Objective Key field, such as "LOOKUP_Scale." This field CANNOT be blank.
- Input Variable: Select RAW_SCORE or PERCENT_CORRECT (or another input based on objectives you have created). This value will be used to determine the output score.
- Lookup Settings: Enter data in the Cut Score and Output fields. You can also import a CSV file.
- The cut score range starts at zero, so the lowest value you enter can be greater than zero; you may use zero if you define a cut score for every raw score (see the example after this list).
- Optionally, select the Use Color Picker checkbox to customize the colors of the score groups. Select Add to create more levels.
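As a purely illustrative example of an imported lookup table, the CSV below converts raw scores to scale scores, with each cut score being the lowest raw score that maps to its output. The header names and values are hypothetical; verify the exact import format in your ADAM instance.

```
Cut Score,Output
0,400
10,450
20,500
30,550
```

With this table, a raw score of 24 would return an output of 500, since 24 meets the cut score of 20 but not 30.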
Dynamic Metadata (ITEM_METADATA):
- Enter a database key in the Objective Key field, such as "DYNAMIC_METADATA." This field CANNOT be blank.
- Score Calculation: Select NUMBER or PERCENT.
- Select Metadata: Select a Bank, then select a Metadata Field.
- You can only select one Metadata Field for each Dynamic Metadata objective.
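To make the idea concrete, the following sketch shows what a Dynamic Metadata objective might compute, assuming NUMBER sums points earned on items sharing each value of the selected metadata field and PERCENT divides by the points possible. The field name, item data, and code are hypothetical.

```python
# Hypothetical items tagged with a bank metadata field (here, "DOK").
items = [
    {"DOK": "1", "earned": 1, "possible": 1},
    {"DOK": "1", "earned": 0, "possible": 1},
    {"DOK": "2", "earned": 2, "possible": 2},
]

def metadata_scores(items, field, calc="PERCENT"):
    # Group item scores by the value of the selected metadata field.
    totals = {}
    for item in items:
        earned, possible = totals.get(item[field], (0, 0))
        totals[item[field]] = (earned + item["earned"], possible + item["possible"])
    if calc == "NUMBER":
        return {value: earned for value, (earned, _) in totals.items()}
    return {value: 100 * earned / possible for value, (earned, possible) in totals.items()}

print(metadata_scores(items, "DOK"))  # {'1': 50.0, '2': 100.0}
```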
External (EXTERNAL):
- Only available when the ADAM Scored checkbox is NOT selected in General Settings (i.e., tests are delivered in TestNav).
- Objective Key must match the external passed data, for example "SCALE_SCORE."
- Score Calculation: Select NUMBER or PERCENT.
- Range Designation: Enter the default low and high scaled scores or number of questions, with 0 as minimum.
- The range for each test can be defined later when tests are added to the program.
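As an illustration of how the Objective Key lines up with externally passed data, the sketch below reads a score field whose name matches the key and checks it against the range designation. The record format and values are hypothetical, not ADAM's actual import schema.

```python
# Hypothetical external score record for one student session; the field
# names and values are illustrative, not ADAM's actual import schema.
record = {"student_id": "S123", "SCALE_SCORE": 512}

# An EXTERNAL objective whose Objective Key is "SCALE_SCORE" would pick up 512.
OBJECTIVE_KEY = "SCALE_SCORE"
LOW, HIGH = 400, 650  # Range Designation: hypothetical default low/high scaled scores

score = record[OBJECTIVE_KEY]
assert LOW <= score <= HIGH, "score outside the designated range"
print(score)  # 512
```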
Primary, Secondary, and Minor Objectives
A program must have a primary performance objective and, optionally, secondary and minor objectives. The primary objective is the main focus of the report and is emphasized there. Unlike secondary objectives, minor objectives do not appear on the main summary screen or student performance screens. For each objective, select Primary, Secondary, or Minor.
For example, the primary objective may be Performance Level and the secondary objective may be Raw Score or Percent Correct. Performance reports will emphasize the score groups (with colors) defined in the Performance Level objective, while the secondary objective provides the score used to determine the performance levels. You may define additional objectives that feed data to the primary or secondary objectives shown on the reports.
Defining the report column order. The Outcome columns are displayed in the report based on the sequence of the configured objectives. Use the up/down arrows in the widget title bar to move objectives up or down. You can modify the order after linking tests to the program.
Not Applicable checkbox. Select the Not Applicable checkbox for an objective that you want to hide on the Performance Report; for example, if you don't want to show a Lexile score for a math test.
Use the Battery Edit tab to set up objectives for battery tests. After defining objectives here, they can be used on the ISR Settings tab by selecting Use Battery Outcome in an ISR widget. The Battery Edit tab is only available when battery testing is enabled in Client Settings (see Client Settings: Battery). The battery tests listed on this tab are populated via TestNav form sync based on the battery definitions defined in the scoring test maps.
- If an expected battery test is not listed, select it from the Create Battery Association dropdown. The battery association will be created in ADAM, and the battery will now appear in the table.
- Click the Edit icon to define the performance objectives for the battery used on the Performance Report. The following objectives are supported:
- Performance Level
- External
- Deleting a battery from this tab does not delete the battery association; only the objectives currently defined for the battery are deleted.
The Test Edit tab lists tests assigned to the program. From here, you can:
- Select the checkbox in the Review Reports column to place all reports for the test under review. Users with permission can review the test reports, and then clear the checkbox here or in Test Configuration to make the reports available to other users.
- Click the Edit icon in the Actions column to open Test Configuration.