The Item Report provides a breakdown of curricular content at the level the test is recommended for. It indicates what percentage of the group of students answered each question (item) on a test successfully, and compares the group's success on the questions with the success of students from the national reference group at a given year level.
The report provides:
- class-wide responses showing how students make meaning without decoding
- question difficulty, text type, and links to each question/item
- information about each question type
- a breakdown of how each student answered each question - wrong answers provide insight into student thinking
Good curriculum knowledge allows teachers to quickly make links between the particular question or text type being tested by individual questions and the students' responses. To scaffold teacher curriculum knowledge, discussion about the skills and knowledge needed to answer a question correctly can support teachers to make formative decisions about how students think and approach text.
When reading the report, it is important to identify which year level the group of students is being compared with. In Term 4, students will automatically be compared with the following reference year's Term 1 results. Remember that national performance on a question is only a reference point; a school might expect its students to consistently perform above the national proportions.
Using the Item Report
The amount of assessment information in the Item Report can seem overwhelming. Teachers are not expected to look at every question and every student. The Item Report can be used to 'drill down' into subject knowledge and skills to:
- Review and compare results for the different text types - do narrative texts differ from others? Are there any patterns?
- CLICK on individual question numbers to see:
  - the text students responded to
  - the questions and their level of difficulty
  - student responses - what answer did they choose, can you work out why, and what are they doing?
- Use the Filters to uncover trends and patterns, and the strengths and weaknesses that exist in the test group
- Identify cohorts in particular areas of the subject who require targeted teaching of specific knowledge and skills
- Follow hunches and 'informal conclusions' to identify actual needs
Reviewing questions where the group’s performance is well under the national performance, or which deviate from the patterns shown on other questions, may highlight particular areas for concern and further investigation.
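As an illustration of this kind of check only, the sketch below compares class and national percentage-correct figures and flags questions sitting well below the national result. All names, values, and the 15-point cut-off are assumptions for the example; the actual figures are read directly from the Item Report screen, which does not export data in this form.

```python
# Hypothetical per-question data: (question number, class % correct, national % correct).
# The values are illustrative only; real figures come from the Item Report itself.
questions = [
    (1, 82, 78),
    (2, 45, 70),
    (3, 60, 64),
    (4, 30, 55),
]

THRESHOLD = 15  # assumed cut-off: flag questions more than 15 percentage points below national

for number, class_pct, national_pct in questions:
    gap = national_pct - class_pct
    if gap > THRESHOLD:
        print(f"Question {number}: class {class_pct}% vs national {national_pct}% "
              f"({gap} points below) - worth further investigation")
```

The same reasoning can be applied by eye: the point is to single out the few questions whose results break the pattern, rather than to review every question.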
CLICK ON THE QUESTION NUMBER to drill down into specific types of text and see the Individual Item Report. This allows users to see:
- the text students read - by reading the text teachers can better understand student responses
- the three question types
- the question difficulty
Filters can be used to identify specific cohorts for inquiry (for example by gender or ethnicity), whether for support or extension, and to target relevant teaching strategies to meet their needs.
Question types
The Individual Item Report codes the questions to indicate whether they involve retrieval, local inference or global inference.
Retrieval questions require the reader to comprehend without needing to infer; that is, the reader matches the wording of the question to wording in the text.
Local inference questions require the reader to comprehend implied information from within relatively small sections of text.
Global inference questions require the reader to comprehend implied information from across relatively larger sections of text.
The question types provide some idea of the listening skills needed to answer each question. It might be that students who have problems with questions involving global inference, for instance, require some help in this area. There are, however, many reasons why a student may have answered a particular question incorrectly, and further investigation should occur before any definitive conclusions are reached. It is important to consider the text when examining performance on the different questions: students' performance can be affected by their engagement in, or knowledge of, particular texts and text types.
Gaps and strengths
A gap in a student's knowledge could be indicated when a student has incorrectly answered a question that he or she was expected to find easy. Although it could also have been just a "silly mistake", it should be followed up and investigated further. Similarly, when a student has correctly answered a question that was expected to be difficult, it could be evidence that he or she has a particular strength in that area. Again, there could be other reasons; for instance, the student may simply have made a lucky guess. It is possible that a gap still exists in this area, which could be confirmed with further questioning.
NB: Each test covers two years' worth of difficulty. If the questions students answered incorrectly are high on the scale, they may not represent gaps or next steps, as students may not yet be ready for the skills and knowledge involved.
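A minimal sketch of this reasoning is below, assuming hypothetical item difficulties and a hypothetical student location on the same scale; the names, margins, and values are invented for illustration and are not taken from the report. It flags "expected easy but wrong" and "expected hard but right" responses, and skips items sitting well above the student's level, in line with the note above.

```python
# Hypothetical data: item difficulties and a student's location on the same scale,
# plus whether the student answered each item correctly. Values are illustrative only.
item_difficulty = {"Q1": 20, "Q2": 35, "Q3": 50, "Q4": 62, "Q5": 75}
student_location = 48
responses = {"Q1": False, "Q2": True, "Q3": True, "Q4": True, "Q5": False}

EASY_MARGIN = 10   # assumed: items this far below the student's location count as 'expected easy'
HARD_MARGIN = 10   # assumed: items this far above count as 'expected hard'
CEILING = 20       # assumed: items this far above are beyond what the student is ready for

for item, correct in responses.items():
    difficulty = item_difficulty[item]
    if difficulty > student_location + CEILING:
        continue  # per the note above: very hard items may not represent gaps or next steps
    if not correct and difficulty < student_location - EASY_MARGIN:
        print(f"{item}: expected easy but answered incorrectly - possible gap, follow up")
    elif correct and difficulty > student_location + HARD_MARGIN:
        print(f"{item}: expected hard but answered correctly - possible strength (or a lucky guess)")
```

As with the report itself, a flag from this kind of check is a prompt for further questioning, not a conclusion.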