The Item Report provides a breakdown of curricular content at the level the test is recommended for. It shows what percentage of the group of students answered each question (item) on a test successfully, and compares the group's success on each question with the success of students from the national reference group at a given year level.


The report provides:

  • class-wide responses for making meaning (comprehension) rather than decoding
  • question difficulty, text type, and links to each question/item
  • information about each question type
  • a breakdown of how each student answered each question; wrong answers provide insight into students' thinking


Good curriculum knowledge allows teachers to quickly make links between the particular question/text type being tested by individual questions and the students' responses. To scaffold teacher curriculum knowledge, discussion about the skills and knowledge needed to answer a question correctly can support teachers to make formative decisions about how students think and approach text.

When reading the report it is important to identify which year level the group of students is being compared with. In Term 4, students will automatically be compared with the following year's Term 1 reference results. Remember that national performance on a question is only a reference point; a school might expect its students to consistently perform above the national proportions.


Using the Item Report


The amount of assessment information in the Item Report can seem overwhelming. Teachers are not expected to look at every question and every student. The Item Report can be used to 'drill down' into subject knowledge and skills to:

  1. Review and compare results for the different text types: how does performance on narrative compare with other text types? Are there any patterns?
  2. CLICK on individual question numbers to see
    • the text students responded to
    • the questions and their level of difficulty
    • student responses: what answer did they choose, can you work out why, and what are they doing?
  3. Use the Filters to uncover trends and patterns, strengths and weaknesses within the test group
  4. Identify specific cohorts in particular areas of the subject who require targeted teaching of knowledge and skills
  5. Follow hunches and 'informal conclusions' to identify actual needs


Reviewing questions where the group's performance is well below the national performance, or deviates from the patterns shown on other questions, may highlight particular areas of concern for further investigation.





CLICK ON THE QUESTION NUMBER to drill down into specific types of grammar and punctuation and see the Individual Item Report. This allows users to see:

  • the questions students read; reading the question helps teachers better understand student responses in relation to the context
  • the question types
  • the question scale difficulty
  • student responses
  • Filters, which identify specific cohorts for inquiry (e.g. by gender and ethnicity) for either support or extension, so that relevant teaching strategies can be targeted to meet their needs


Gaps and strengths 
Evidence that a gap exists in a student's knowledge could be indicated when a student has incorrectly answered a question that he or she was expected to find easy. Although it could also have been just a 'silly mistake', it should be followed up and investigated further. Similarly, when a student has correctly answered a question that was expected to be difficult, it could be evidence that he or she has a particular strength in that area. Again, there could also be other reasons; for instance, the student may have simply made a lucky guess, and it is still possible that a gap exists in this area, which could be confirmed with further questioning.

NB: Each test covers two years' worth of difficulty. If the questions students answered incorrectly are high on the scale, they may not represent gaps or next steps, as students may not yet be ready to understand the associated skills and knowledge.