Interim assessments contain secure items that educators are able to view in CRS, but that should not be printed, downloaded, or shared in any way. Reports for individual interim tests may include the following:

  • Item-level data

  • Access to the items themselves

  • Access to student responses to the items

Test results for adaptive assessments include item-level data only on the individual student level.

How to View Standards for Each Item

In a report displaying item-level data, you can view the standard or standards to which each item is aligned. This allows you to determine at a glance what the item measures.

To show and hide item standards, click the Standard Keys toggle in the row of filter details below the report table heading (refer to Icons). A standard key or list of standard keys appears under each item number (refer to Figure 54). Note that this toggle does not affect printouts or exports, which always include the standard keys when they include item-level data.

Click the More Information button beside the standard keys to view legends displaying the Reporting Category, Knowledge and Skill Statement, and Student Expectation as in Figure 57 (refer to Icons). This full text is not included in printouts or exports.

Table 23. District Performance on Test Report Elements

#  Element
1  Standard Keys toggle
2  Standard keys below item numbers

How to View an Item

You can view the items themselves, along with student responses to those items.

Do either of the following (refer to Figure 58):

  • To view the item without a student’s response, click the item number in the first row of the report table.

  • To view the student’s response to the item, find that student’s name in the Student column on the left. Then click the score the student obtained on that item.

The Item View window appears (refer to Figure 59). It contains an Item & Score tab and a Rubric & Resources tab. A banner at the top of the window displays the item’s number, score (when the item includes the student’s response), and confidence level (when a machine-suggested score has a low confidence level). The Item & Score tab shows the item and may include a particular student’s response.

The Item & Score tab may include the following sections:

  • Scoring Assertion: Each scoring assertion pairs a statement about what the student did in their response with the content knowledge, skill, or ability that the response evidences. When you are viewing a student’s response and the item has scoring assertions, the Scoring Assertion table appears, listing each assertion and outcome (refer to Figure 58).

  • Item: Displays the item as it appeared on the assessment in the Student Testing Site. For items associated with a passage, the passage also appears.

The Rubric & Resources tab (refer to Figure 59) may include the following sections, which you can expand and collapse by clicking the Expand and Collapse buttons, respectively (refer to Icons).

  • Details: May provide the following information:

    • Topic: Skill area to which the item belongs

    • Content Alignment: Describes the standard to which the item is aligned

  • Resources: Provides links to any exemplars or training guides available for the item

  • Rubric: Displays the criteria used to score the item. This section may also include a score breakdown, a human-readable rubric, or an exemplar, which provides an example of a response for each point value.

  • Frequency Distribution of Student Responses: The table in this section breaks down how many students at the campus earned each possible point value available for a fixed-form test item.

Table 24. My Students' Performance on Test Report: Performance by Student Tab Elements

#  Element
1  Item number (click to view item without student response)
2  Item score for a particular student (click to view item with student response)

Table 25. Item View Window Elements

#  Element
1  Item & Score tab (selected)
2  Rubric & Resources tab

How to View Items With and Without the Students’ Visual Settings

When viewing items with students’ responses, you may or may not want to see the items exactly as the students saw them on the test. For example, some students’ tests are set to use large fonts, different color contrast, or Spanish. To show or hide these settings for all items:

  1. Click the Features and Tools menu in the banner and select Set Student Setting on Item View (refer to Figure 62). The Set Student Setting on Item View window appears (refer to Figure 63).

  2. Select Yes to show students’ visual settings on all items or No to hide them.

  3. Click Save.

You can also show or hide visual settings on a per-item basis. To do so, click the toggle at the upper-right of the item you are viewing. This action has no effect on your global setting.

How to Navigate to Other Items from the Item View Window

Use the Previous Item and Next Item buttons, labeled with the previous and next item numbers, at the upper corners of the Item View window (refer to Icons).

How to View Another Student's Response to the Current Item

If you have accessed the student’s response from a report showing multiple students, you can click the arrows beside the Demo Student Field button at the top of the window (refer to Icons). Students are listed in the same order in which they are sorted in the report.

What It Means When an Item Score Reads “n/a”

You may sometimes see “n/a” instead of a score for an item. This typically means either the student did not respond to the item or the item was not included in that form of the test.

View Aggregate Item Data in the Item Analysis Report

An Item Analysis Report, which reports item data at the district, school, and class (roster) levels, is available for every fixed-form assessment. Highlights of the Item Analysis Report include the following:

  • Item reporting categories and standard alignments

  • Item types (for example, multiple choice)

  • The percentage of the group of students who earned full, partial, and no credit on each item; if an item does not allow partial credit, “N/A” is displayed instead of a percentage.
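
The credit percentages above are simple shares of the student group. As a rough illustration only, a hypothetical helper (not part of CRS) could compute them like this; the function name and inputs are assumptions for the sketch:

```python
def credit_breakdown(scores, max_points):
    """Percent of students earning full, partial, and no credit on an item.

    Hypothetical illustration of the Item Analysis percentages -- not CRS code.
    Returns (full, partial, none); partial is None when the item does not
    allow partial credit, mirroring the report's "N/A".
    """
    total = len(scores)
    full = sum(1 for s in scores if s == max_points) / total * 100
    none = sum(1 for s in scores if s == 0) / total * 100
    if max_points > 1:
        # Partial credit is whatever remains between full and no credit.
        partial = 100 - full - none
    else:
        partial = None  # one-point item: partial credit is not possible
    return full, partial, none
```

For example, four students scoring 2, 1, 0, and 2 on a two-point item yield 50% full, 25% partial, and 25% no credit.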

To access the report, navigate to a District, School, or Roster Performance on Test report. In the Features & Tools menu, select Build Item Analysis Report. The Item Analysis window appears, open to the Summary tab (Figure 65).


To expand a table row and display more detailed data, as in Figure 66, click the Expand button on the right (refer to Icons). Click Show All Details to expand all rows. Click the Collapse button or Hide All Details to collapse rows.

  • For multiple-choice items, the detailed data includes the distribution of students who selected each option, with the correct answer flagged with a checkmark (refer to Icons).

  • For items that are not multiple choice or are multi-part, the detailed data includes the distribution of points earned.


You can export the Summary tab to a comma-separated values (CSV) file using the Export menu at the top right. Select Basic Summary to include only the default table rows or Detailed Summary to include all the details in the expandable rows.
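
Once exported, the Summary tab can be processed like any CSV file. The sketch below uses Python's standard csv module; the column names and values in the sample are hypothetical, since the actual export layout depends on the test:

```python
import csv
import io

# Hypothetical sample of an exported Basic Summary CSV.
# Real column names and rows will differ by test.
sample = """Item,Item Type,Percent Full Credit,Percent Partial Credit,Percent No Credit
1,Multiple Choice,72,N/A,28
2,Constructed Response,40,35,25
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Items on which at least half the group earned full credit.
strong = [
    r["Item"]
    for r in rows
    if r["Percent Full Credit"] != "N/A"
    and float(r["Percent Full Credit"]) >= 50
]
```

Note how the sketch guards against the “N/A” placeholder before converting a percentage to a number, since items without partial credit carry that literal string in the export.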

If item view is available for this test, you can navigate to the Single Item View tab (Figure 67) either by clicking it or by clicking the number to the left of a listed item. This tab displays detailed data on an item, and below that, the item itself, including the Rubric & Resources tab. You can navigate between items using the item number buttons on the left and right.
