The Item Report provides a curricular content breakdown at the level of the curriculum for which the test is recommended. It indicates what percentage of the group of students answered each question (item) on the test successfully, and compares the group's success on each question with the success of students in the national reference group at a given year level.


The report provides links to each question (item), plus a breakdown of how each student answered each question, giving in-depth diagnostic insight into individual students' knowledge and skill in specific areas of the subject. Good curriculum knowledge allows teachers to quickly make links between the particular question or text type being tested and students' responses, and to make formative decisions about how students think about and approach text.

When reading the report, it is important to identify which year level the group of students is being compared with. In Term 4, students are automatically compared with the following reference year's Term 1 results. Remember that national performance on a question is only a reference point; a school might expect its students to consistently perform above the national proportions.


Using the Item Report


The amount of assessment information in the Item Report can seem overwhelming. Teachers are not expected to look at every question and every student. The Item Report can be used to 'drill down' into subject knowledge and skills to:

  1. Review results for the different text types and compare them. Do results for narrative texts differ from the others? Are there any patterns?
  2. CLICK on individual question numbers to see:
    • the text students responded to
    • the questions and their level of difficulty
    • student responses: which answer did they choose, and can you work out why?
  3. Use the filters to uncover trends and patterns, and strengths and weaknesses, within the test group.
  4. Identify cohorts who require targeted teaching in particular areas of the subject.
  5. Follow hunches and 'informal conclusions' to identify actual needs.


Reviewing questions where the group's performance is well below the national performance, or where results deviate from the patterns shown on other questions, may highlight particular areas for concern and further investigation.


CLICK ON THE QUESTION NUMBER to drill down into specific types of text and see the Individual Item Report. This allows users to see:

  • the text students read; reading the text helps teachers better understand student responses
  • the three question types
  • the question difficulty
  • the filters, which can be used to identify specific cohorts (for example by gender or ethnicity) for support or extension, and to target relevant teaching strategies to meet their needs.

Question types 
The Individual Item Report codes the questions to indicate whether they involve retrieval, local inference, or global inference.

  • Retrieval questions require the reader to comprehend without needing to infer; that is, the reader matches the wording of the question to wording in the text. 

  • Local inference questions require the reader to comprehend implied information from within relatively small sections of text. 

  • Global inference questions require the reader to comprehend implied information from across relatively larger sections of text. 

The question types give some idea of the reading skills each question requires. Students who have problems with questions involving global inference, for instance, may require some help in this area. However, there are many reasons why a student may have answered a particular question incorrectly, and further investigation should occur before any definitive conclusions are reached. It is also important to consider the text when examining performance on the different questions: students' performance can be affected by their engagement with, or knowledge of, particular texts and text types.


Gaps and strengths 
Evidence that a gap exists in a student's knowledge could be indicated when a student incorrectly answers a question he or she was expected to find easy. Although it could simply have been a "silly mistake", it should be followed up and investigated further. Similarly, when a student correctly answers a question that was expected to be difficult, it could be evidence of a particular strength in that area. Again, there could be other reasons; for instance, the student may simply have made a lucky guess. It is still possible that the student has a gap in this area, which could be confirmed with further questioning.

NB: Each test covers two years' worth of difficulty. If the questions students answered incorrectly are high on the scale, they may not represent gaps or next steps, as students may not yet be ready for the associated skills and knowledge.