The Individual Report maps how well a student has performed on the different questions within a test.  

The Individual Report displays the question numbers against the PAT:Mathematics scale (patm). The questions are grouped according to the Strand they are associated with, and are positioned against the scale according to the level of skill and knowledge required to answer them. In Adaptive testing, the student's responses drive the choice of questions, so the range of question difficulties on the scale is narrower and students will get up to about 60% of questions correct.

The Report Key explains student responses:

  • Black circle - answered correctly
  • White circle - answered incorrectly
  • Grey circle - omitted

The student's scale position is shown by the dotted line, which intersects the scale and the stanine score distributions for three different year levels.

  • The blue highlighting around the dotted line indicates the margin of error associated with the student's score. If the test could be repeated, we would expect the student to score within the range indicated by the highlighting about two-thirds of the time. Students who achieve very highly or very poorly on a test will have a larger margin of error associated with their score, so their result is less reliable.
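
As an illustration only (the numbers here are hypothetical, not taken from an actual report): if a student's scale score is 45 patm and the highlighted band runs from about 41 to 49 patm, then on a repeat of the test we would expect the student's score to land between 41 and 49 roughly two-thirds of the time, and outside that band roughly one-third of the time.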

Highly capable or struggling students should sit a test that matches their ability level, or the PAT:Mathematics Adaptive test, which quickly adapts to their level.

Typically, a student is more likely to answer correctly the questions located below the line than those above it. When a question is located well below the line, there is a strong expectation that it will be answered correctly. In contrast, it is very unlikely that a question located well above the line will be answered correctly. Each test covers at least a two-year range of difficulty. CLICK on a question number to reveal the question and the class responses.
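
As a rough guide (the exact probabilities depend on how the scale has been calibrated, so treat these figures as indicative only): a question sitting at about the same patm value as the student's line is one the student has roughly an even chance of answering correctly; the further a question sits below the line, the more likely a correct answer becomes, and the further it sits above the line, the less likely.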

By CLICKING on the SHOW QUESTIONS button in the right-hand menu, users can switch between List View and Strand View below the graph.

Show Questions reveals a second page below the graph containing question descriptions and functions for organising and ordering the questions.

  • Show List View: order questions by scale difficulty, order numerically, group related questions, or order question descriptions by verb
  • Show Strand View: questions organised by Strand, with options to order by scale difficulty, order numerically, or order question descriptions by verb


Each question can be displayed by clicking on its question number. This opens the Individual Item Report for that question, which shows the question as it was presented in the test, provides a range of information about it, displays the multi-choice options selected by students, and links to similar questions in the test.



Gaps and strengths
A gap in a student's knowledge may be indicated when the student has incorrectly answered a question that he or she was expected to answer correctly (one located below or around their scale score line). Although it could also have been just a "silly mistake", it should be followed up and investigated further.

Similarly, when a student has correctly answered a question that was expected to be answered incorrectly, it could be evidence that he or she has a particular strength in that area. Again, there could be other explanations; for instance, the student may simply have made a lucky guess. It is also possible that the student still has a gap in this area, which could be confirmed with further questioning; depending on the question's difficulty, it may involve a concept the student is not yet ready for.

NB:
  • When students sit tests that are too easy, there are no questions at their scale level and users cannot define next steps.
  • When students sit tests that are too hard, their responses are unreliable (guesses) and users cannot define next steps.


Comparing Performance across Strands
The Individual Report can be used to provide an indication of how a student has performed in the different Strands. However, it is important to note that each Strand has a different number of questions and that these questions vary in difficulty. This makes it unwise to draw direct comparisons between Strands on the basis of the number of questions answered correctly. There is evidence that a student is performing less successfully in a Strand when he or she is unable to answer the majority of the questions in that Strand that are located below the student's position on the scale.
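
To illustrate (with hypothetical Strands and numbers): a student who answers 4 of 6 Number questions correctly and 4 of 9 Geometry questions correctly has not necessarily performed better in Number, because the Geometry questions may simply have been more difficult. A more reliable signal is the pattern within each Strand relative to the student's line: if most of the Geometry questions located below the line were answered incorrectly, that is stronger evidence of a relative weakness in Geometry.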