The Item Report provides a breakdown of curricular content at the curriculum level the test is recommended for. It indicates what percentage of the group of students answered each question (item) on the test correctly, and compares the group's success on each question with that of students from the national reference group at a given year level.


The report provides links to each question/item, plus a breakdown of how each student answered each question, giving in-depth diagnostic insight into the knowledge and skills of the cohort and of individual students in specific areas of the subject. Good curriculum knowledge allows teachers to quickly make links between the particular question or text type being tested and the students' responses, and so make formative decisions about how students think about and approach text.


When reading the report, it is important to identify which year level the group of students is being compared with. In Term 4, students are automatically compared with the following year level's Term 1 results; for example, Year 4 students tested in Term 4 are compared with Year 5 Term 1 reference results. NB: national performance on a question is only a reference point; a school might expect its students to consistently perform above the national position.


Using the Item Report


The amount of assessment information in the Item Report can seem overwhelming. Teachers are not expected to look at every question and every student. Instead, the Item Report can be used to 'drill down' into subject knowledge and skills to:

  1. Review and compare results for the different text types: how does narrative compare with the others? Are there any patterns?
  2. CLICK on individual question numbers to see:
    • the text students responded to
    • the questions and their level of difficulty
    • student responses: which answer did they choose, can you work out why, and what are they doing?
  3. Use the Filters to uncover trends and patterns, and the strengths and weaknesses that exist in the test group.
  4. Identify cohorts who require targeted teaching of specific knowledge and skills in particular areas of the subject.
  5. Follow hunches and 'informal conclusions' to identify actual needs.


Reviewing questions where the group's performance is well below the national performance, or which deviate from the patterns shown on other questions, may highlight particular areas of concern for further investigation.


CLICK ON THE QUESTION NUMBER to drill down into specific types of text and open the Individual Item Report. This allows users to see:

  • the contextual sentence
  • the multi-choice answers
  • the students' choices across the multi-choice answers
  • the link to the ARBs (Assessment Resource Banks)

Filters can isolate specific cohorts for inquiry (for example, by gender or ethnicity) for either support or extension, helping teachers target relevant teaching strategies to meet the identified needs.


Gaps and strengths
A gap in a student's knowledge may be indicated when the student incorrectly answers a question that he or she was expected to find easy. Although it could simply have been a “silly mistake”, it should be followed up and investigated further. Similarly, when a student correctly answers a question that was expected to be difficult, it could be evidence of a particular strength in that area. Again, there could be other reasons; for instance, the student may simply have made a lucky guess. In either case, further questioning can confirm whether a genuine gap or strength exists.

NB: Each test covers two years' worth of difficulty. If the questions students answered incorrectly sit high on the scale, they may not represent gaps or next steps, as students may not yet be ready for the associated skills and knowledge.