Interim assessments contain secure items that educators are able to view in CRS, but that should not be printed, downloaded, or shared in any way. Reports for individual interim tests may include the following:

  • Item-level data

  • Access to the items themselves

  • Access to student responses to the items

Test results for adaptive assessments include item-level data only at the individual student level.


How to View Item Scores

To expand sections containing item data, click the vertical section bars as in Figure 54.

How to Learn Which Items Students Performed on Best or Struggled with Most

Look in the sections 5 Items on Which Students Performed the Best and 5 Items on Which Students Performed the Worst (refer to Figure 55). You can click the vertical section bars to expand them, as in other sections.

How to View Standards for Each Item

In a report displaying item-level data, you can view the standard or standards to which each item is aligned. This allows you to determine at a glance what the item measures.

To show and hide item standards, click the Standard Keys toggle button in the row of filter details below the report table heading (refer to Icons). A standard key or list of standard keys appears under each item number (refer to Figure 54). Note that this toggle does not affect printouts or exports, which always include the standard keys when they include item-level data.

Click the More Information button beside the standard keys to view legends displaying the full text of each cluster (category of standards) and each standard, as in Figure 57 (refer to Icons). This full text is not included in printouts or exports.

Table 23. District Performance on Test Report Elements

# | Element
1 | Standard Keys toggle
2 | Standard keys below item numbers

How to View an Item

You can view the items themselves, along with student responses to those items.

Do either of the following (refer to Figure 58):

  • To view the item in a blank state (without a student’s response), click the item number in the first row of the report table.

  • To view the student’s response to the item, find that student’s name in the Student column on the left. Then click the score the student obtained on that item.

The Item View window appears (refer to Figure 59). It contains an Item & Score tab and a Rubric & Resources tab. A banner at the top of the window displays the item’s number, score (when the item includes the student’s response), and confidence level (when a machine-suggested score has a low confidence level). The Item & Score tab shows the item and may include a particular student’s response.

The Item & Score tab may include the following sections:

  • Scoring Assertion: Each scoring assertion contains both a statement describing what the student did in their response and the content knowledge, skill, or ability evidenced by that response. When you are viewing a student’s response and the item has scoring assertions, the Scoring Assertion table appears, listing each assertion and its outcome (refer to Figure 58).

  • Item: Displays the item as it appeared on the assessment in the Student Testing Site. For items associated with a passage, the passage also appears.

  • The Rubric & Resources tab (refer to Figure 59) may include the following sections, which you can expand and collapse by clicking Expand button and Collapse button, respectively (refer to Icons).

  • Details: May provide the following information:

    • Topic: Skill area to which the item belongs

    • Content Alignment: Describes the standard to which the item is aligned

  • Resources: Provides links to any exemplars or training guides available for the item

  • Rubric: Displays the criteria used to score the item. This section may also include a score breakdown, a human-readable rubric, or an exemplar, which provides an example of a response for each point value.

  • Frequency Distribution of Student Responses: The table in this section provides a breakdown of how many students at the campus earned each possible point value available for a fixed-form test item.

Table 24. My Students' Performance on Test Report: Performance by Student Tab Elements

# | Element
1 | Item number (click to view item without student response)
2 | Item score for a particular student (click to view item with student response)

Table 25. Item View Window Elements

# | Element
1 | Item & Score tab (selected)
2 | Rubric & Resources tab

How to View Items With and Without the Students’ Visual Settings

When viewing items with students’ responses, you may or may not want to see the items exactly the way the students saw them on the test. For example, some students’ tests are set to use large fonts, different color contrast, or Spanish. To show or hide students’ visual settings on all items:

  1. Click the Features and Tools menu in the banner and select Set Student Setting on Item View (refer to Figure 62). The Set Student Setting on Item View window appears (refer to Figure 63).

  2. Select Yes to show students’ visual settings on all items or No to hide them.

  3. Click Save.

You can also show or hide visual settings on a per-item basis. To do so, click the toggle at the upper-right of the item you are viewing (refer to Figure 64). This action has no effect on your global setting.


How to Navigate to Other Items from the Item View Window

Use the Previous Item and Next Item buttons, labeled with the previous and next item numbers, at the upper corners of the Item View window (refer to Icons).

How to View Another Student's Response to the Current Item

If you have accessed the student’s response from a report showing multiple students, you can click the arrows beside the student name field at the top of the window (refer to Icons). Students are listed in the same order in which they are sorted in the report.

What It Means When an Item Score Reads “n/a”

You may sometimes see “n/a” instead of a score for an item. In some cases, the student did not respond to the item, or the item was not included in that form of the test.