The Individual Report maps how well a student has performed on the different questions within a test. Although the Individual Reports for the PATs, STAR and S:TwE can look different, they are all built on a set of common concepts. When viewing the Individual Report for PAT:Mathematics, users see a graph displaying student responses and statistics.
By CLICKING on the SHOW QUESTIONS button in the right-hand menu, users can switch between List View and Strand View (see below).
The Individual Report displays the questions against the PAT:Mathematics scale (patm). The questions are grouped according to the Strand they are associated with, and each question is positioned according to its location on the scale, which is determined by the level of skill and knowledge required to answer it.
The Report Key explains student responses:
- Black circle - answered correctly
- White circle - answered incorrectly
- Grey circle - omitted
- The student's scale position is shown by the dotted line, which intersects the scale and the stanine score distributions for three different year levels.
- The blue highlighting around the dotted line indicates the margin of error associated with the student’s score. If the test could be repeated, we would expect the student to score in the range indicated by the highlighting about two thirds of the time. Students who achieve very highly or very poorly on a test will have a larger error associated with their score, making it less reliable.
- Highly capable or struggling students should sit the test that matches their ability level, or the PAT:Mathematics Adaptive test, which quickly adapts to their ability level.
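The "about two thirds of the time" wording in the key corresponds to roughly one standard error of measurement (SEM) either side of the scale score, assuming a normal error model. The sketch below illustrates the idea; the score and SEM values are hypothetical, not official PAT figures.

```python
# Illustrative only: the blue band around the dotted line covers roughly
# one SEM either side of the scale score. The numbers here are invented
# examples, not values from any actual PAT report.

def score_band(scale_score: float, sem: float) -> tuple[float, float]:
    """Return the (low, high) range highlighted around the dotted line."""
    return (scale_score - sem, scale_score + sem)

low, high = score_band(scale_score=45.0, sem=3.5)
print(f"Expected range about two thirds of the time: {low} to {high} patm units")
```

A larger SEM (as for students scoring very high or very low on a test) simply widens this band, which is why those scores are described as less reliable.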
Typically, a student is more likely to correctly answer the questions located below the line than those above it. When a question is located well below the line there is a strong expectation that it will be answered correctly. In contrast, it is very unlikely that a question located well above the line will be answered correctly. CLICK on a question number to reveal the question and see the class responses.
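The "below the line is likely correct, above the line is likely incorrect" pattern can be expressed with a Rasch-style logistic function, a common way of modelling scales of this kind. This is an illustrative sketch only, not NZCER's published scoring method; the `scale` parameter and all values are assumptions.

```python
import math

def p_correct(student_location: float, question_location: float,
              scale: float = 1.0) -> float:
    """Rasch-style probability that the student answers the question correctly.
    Both locations are on the same (e.g. patm) scale; 'scale' is a
    hypothetical spread parameter, not an official PAT constant."""
    return 1.0 / (1.0 + math.exp(-(student_location - question_location) / scale))

print(round(p_correct(50.0, 50.0), 2))  # a question at the line: 0.5
print(round(p_correct(50.0, 40.0), 2))  # well below the line: 1.0
```

The further a question sits below the student's dotted line, the closer the expected probability of a correct answer gets to certainty, which is what makes unexpected wrong answers informative.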
Show Questions reveals a second page below containing question descriptions, together with organisation and ordering functions.
- Show List View: order questions by scale difficulty, order them numerically, group related questions, or order question descriptions by verb
- Show Strand View: questions organised into Strands; order by scale difficulty, order numerically, or order question descriptions by verb
Each question can be displayed by clicking on a question number. This opens the Individual Item Report for that question, which displays the actual question as presented in the test, shows the multi-choice options selected by students, and provides a range of information about the question, including links to similar questions in the test.
Gaps and strengths
A gap in a student’s knowledge may be indicated when the student has incorrectly answered a question that he or she was expected to answer correctly. Although it could also have been just a “silly mistake”, it should be followed up and investigated further.
Similarly, when a student has correctly answered a question that was expected to be answered incorrectly, it could be evidence that he or she has a particular strength in that area. Again, there could be other reasons; for instance, the student may simply have made a lucky guess. For example, suppose the student has answered Question 7 incorrectly (it is presented as a white circle). Given the student’s overall level of achievement, this question was expected to be answered correctly. It is possible that the student has a gap in this area, which could be confirmed with further questioning.
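The gap-and-strength reasoning above can be sketched as a simple flagging rule: an unexpectedly wrong answer well below the student's line suggests a possible gap, and an unexpectedly right answer well above it suggests a possible strength. The `margin` threshold and all locations here are illustrative assumptions, not part of the actual report.

```python
# Hypothetical flagging rule sketched from the text; 'margin' is an
# illustrative threshold for "well below/above the line", not a PAT value.

def flag_response(student_loc: float, question_loc: float,
                  correct: bool, margin: float = 5.0) -> str:
    if question_loc <= student_loc - margin and not correct:
        return "possible gap - follow up"
    if question_loc >= student_loc + margin and correct:
        return "possible strength - follow up"
    return "as expected"

# A Question-7-style case: an easy question answered incorrectly.
print(flag_response(student_loc=48.0, question_loc=40.0, correct=False))
```

Either flag is only a prompt for follow-up questioning, since silly mistakes and lucky guesses produce the same surface pattern.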
NB: When students sit tests that are too easy, there are no questions at their scale level and users cannot define next steps.
When students sit tests that are too hard, their responses are unreliable (guesses) and users cannot define next steps.
Comparing Performance across Strands
The Individual Report can be used to provide an indication of how a student has performed in the different Strands. However, it is important to note that each Strand has a different number of questions and that these questions vary in difficulty. This makes it unwise to compare Strands directly on the basis of the number of questions answered correctly. There is evidence that a student is performing less successfully in a Strand when he or she is unable to answer the majority of that Strand’s questions located below the student’s position on the scale.