Mathematics assessment in Queensland schools urgently needs improvement.

1)  A central authority could outline the general requirements for a series of exam papers, such as a suitable range of difficulty levels, a mix of theoretical and practical questions, a spread from familiar to unfamiliar contexts, and a range from short-response to extended-response items.  There would be no need for the mass of criteria and verbal descriptors of standards that we have now.  After that, it should be up to the schools to set exams and use marks to assess and grade their students.  Many schools believe that they must avoid using marks and instead use criteria sheets, which are matrices of cells containing verbal statements of standards.

QSA’s position is that marks can be used, but decisions on ratings cannot be made simply by applying cut-off marks: the standards matrices must be applied.  QSA insists that a mark of, say, 90% does not guarantee an “A” standard of work, because the student may not have fulfilled all of the “A” descriptors in the syllabus.  However, the following extract from page 32 of the Queensland MAB syllabus, produced by QSA, states that a standard can be awarded without every descriptor necessarily being ticked.  I can’t see the difference:
 
“When teachers are determining a standard for each criterion, it is not always necessary for the student to have met each descriptor for a particular standard; the standard awarded should be informed by how the qualities of the work match the descriptors overall”.

If we were simply able to award marks when assessing a well-set exam paper, it would be the job of review panels to check that suitable standards and balances have been maintained.  This is what happened before the QSA became such a powerful influence and advocate for non-marks assessment.  It worked well.  The best students got the best ratings.  We don’t need a cumbersome system in order to achieve that.  Mathematics teachers are good at setting suitably balanced assessment and also at awarding part marks for imperfect solutions and giving full credit for correct alternative solution methods.

The advantage of this would be greater simplicity in devising assessment tasks, marking them, and grading students.  There would be no loss of validity in the grades produced.  In classrooms, the change would produce better teaching and learning.  At the moment, assessment is driving the agenda, because of the many requirements to be satisfied by an “assessment package”.  That’s wrong: our emphasis should be on teaching and learning.  Young teachers, including pre-service teachers, and experienced teachers alike have to spend large amounts of professional development time trying to understand QSA’s onerous assessment requirements.  What we all need to be doing instead is learning more about our subject and its applications, and about the best ways of introducing topics to students and inspiring them in the subject.  We are prevented from using our time to the best pedagogical advantage because of the time taken to embed a host of little detailed requirements into an “assessment package”.

The marks system is working well in NSW, Victoria, England, and other places, and in most universities.  Some of QSA’s own QCS tests are assessed with a marks system.  It’s crazy that students suffer under QSA’s cumbersome system for school assessment, and then enter university to find that they are assessed simply and validly using marks alone.

Using marks allows teachers to differentiate between students more readily than placing ticks in the cells of criteria sheets.  Advocates of the latter system place ticks toward the right or left edges of the cells, or in the centre, depending on how well they judge the student to have met the particular descriptor.  That is subjective and unreliable.  Awarding marks according to a marking scheme, and giving credit for alternative solutions that differ from the adopted scheme, produces fair and defensible judgments and allows teachers to rank students in order of merit when required to do so.

2) At present there is a plethora of criteria and descriptors of standards in the syllabuses.  There are dot-points such as “comment on the strengths and limitations of models, both given and developed” or “identification of assumptions and their associated effects, parameters and/or variables”, which require understandings that are sometimes more appropriate to tertiary-level studies.  It is a more urgent priority for high school students to gain and apply mathematical skills to make predictions or solve problems, but these idealistic dot-point requirements eat into the time available for students to develop a basic appreciation of their subject.  The tragedy of the current situation is that if a school happens to omit even one of these micro-specifications from its assessment “package”, its proposed ratings will be reduced by review panels, even though the level of challenge in the school’s questions may be higher than that of schools that have diligently ticked all the boxes by attending to the many little dot-points.  When exam questions are designed with lots of tick-boxes in mind, the assessment can become stilted and artificial.  Questions need only be graded in level of challenge, and broadly divided into a basic level, suitable for students to obtain a basic “Pass”, and a higher level that examines problem-solving skills in a range of situations and allows students to achieve the highest ratings.

3) At the link below, you will see the newest incarnation of the standards to be applied in Years 1-10, in this case Year 1.  They are called “Standards Elaborations”, and they are the means by which teachers are expected to assess their Year 1 students.  By changing the “1” in “yr1” in the URL, you can reach the other year levels:

http://www.qsa.qld.edu.au/downloads/p_10/ac_math_yr1_se.pdf

If you compare the descriptors in, say, columns 1 and 2 for Year 1, you might agree with me that it would be very hard for a Year 1 teacher, or anyone else, to decide which column is most appropriate for a particular student’s responses.  Who wants their children or grandchildren to bring home a results sheet like this, or a single letter rating derived from this matrix?  It would be far superior, and more useful, for the child to bring home a spelling test or other test or project marked “18/20” with the mistakes highlighted.  Marking would be much easier and more reliable, and families could easily see how the child is progressing.  Yet this criteria sheet system is now being used throughout Years 1-10 in our state.

4) External exams would be better than the patchwork quilt that Queensland has now, where schools all set different exams from each other and no one is really sure whether questions marked “unfamiliar” are really so.  The NSW system combining school assessment with the HSC works very well, and it should be easy for Queensland to move to that kind of system.  It ensures more reliable comparability of students from different schools.  As the National Curriculum is implemented throughout Queensland schools, it makes sense for us to adopt a national assessment approach.  QSA has been setting good external Senior exams in Queensland but has announced that it will discontinue them.  It should be quite possible for QSA to continue setting them for full-scale use in schools.  Money saved on review panels could be used to pay teams of markers.

5) QSA claims that Queensland’s assessment system is “world’s best practice”.  There is no objective evidence to support this; it is the belief of academic theorists.  No other education system has paid QSA this compliment: it is merely a wistful statement by insiders, and therefore of little value.  If it were world’s best practice, why haven’t NSW, Victoria, the UK and others converted to it?  Why don’t our university maths departments adopt it?

Let’s improve mathematics teaching in the schools by freeing up teachers’ time so they can concentrate on improving pedagogy and inspiring students, instead of suffocating in a straitjacket of unnecessary assessment requirements.  Let parents once again receive clear simple statements of marks earned by their children.

Marks in Senior Maths

We still use marks in all of our maths assessment, in both KAPS and MAPS, to collect data on students. I know this is frowned upon by QSA. However, I find marks are the best way to give students feedback on their progress and to give appropriate weighting to the various questions. In addition, marks automatically weight the various assessment items.

Most schools use the following format each semester:

Mid-semester test
An EMPS (i.e. an assignment)
End-of-semester test

Students are awarded a standard in each of the three criteria (KAPS, MAPS and CAJ) on each item. If one “averages” these standards, there is an implicit equal weighting given to each of the three items. However, in the vast majority of cases the EMPS covers a smaller amount of work. As a rule of thumb, my tests are worth 40% each and the EMPS 20% of the total marks.
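To make that weighting explicit, here is the arithmetic as a rough sketch only, using my own 40/40/20 rule of thumb (where T1 and T2 stand for the two test marks and E for the EMPS mark, each expressed as a percentage):

\[ \text{Total} = 0.4\,T_{1} + 0.4\,T_{2} + 0.2\,E \]

By contrast, “averaging” the three letter standards implicitly weights each item at one third, which overstates the smaller EMPS.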

I am happy to retain the EMPS.

Talking to the science teachers in the staffroom: yes, the EEIs are very large, there is an expectation about their size, and the teachers would certainly appreciate an expectation of shorter, smaller EEIs. However, they are keen to retain them. Many, if not most, would be happy to use marks.

On School Maths

Perhaps you, too, have noticed that school graduates no longer know their times-tables, cannot add fractions and cannot do long division? Even at [University name removed], we are finding that students starting science and engineering degrees are not confident with standard mathematical skills.

Mathematics is the language of the physical world. Science and technology rely on mathematics. Studying mathematics develops sound reasoning, and it has been a core discipline in the pursuit of clarity of thought for thousands of years. So why are we now failing to pass on this gift?

1. What is wrong?

Maths teachers with over twenty years’ experience in Queensland, and those who have also taught in other systems, can readily explain what is wrong with our school system:

The reason students do not know their times-tables is that our maths teachers have been instructed not to have students memorise facts! The reason students do not know how to add fractions, do not know how to do long division, and do not have confidence in carrying out mathematical procedures is that maths teachers, at all school levels, have been instructed to de-emphasise the standard algorithms and not to use repetition.

No, it’s not a terrorist giving our teachers these instructions. It’s the recent fashion of educational ideology endorsed by our educational theorists. This ideology is attributed to the 1950s psychologist Benjamin Bloom, whose taxonomy classifies activities such as remembering and understanding as ‘lower order’, while activities like application and evaluation are considered ‘higher order’.

Bloom’s theory of ‘higher-order thinking’ may have appeal in some sectors, but it is not suited to mathematics, since mathematics, much like learning to play a musical instrument, requires years of practice and repetition. Following Bloom, our school maths has instead become like one of those ‘music appreciation courses’ in which students are briefly exposed to a sweeping range of topics but never really learn how to play themselves. Would you rather hear your child say “some people can do it”, or hear them say “I can do it”?

With our emphasis on so-called ‘higher-order’ thinking, we have neglected the basics. This has been disastrous for learning maths, because maths builds upon itself from one year to the next: calculus relies on advanced algebra, which relies on simple algebra, which relies on standard arithmetic, which relies on knowledge of the times-tables. Only half-knowing maths one year means only one-quarter-knowing it the next year, and only one-eighth-knowing it the year after that, and so on, until you’re having nightmares about arriving at school on the day of the exam, completely unprepared and without any clothes on.
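To spell out the compounding arithmetic behind that last sentence (an illustration only, assuming each year’s grasp is roughly halved again by the shaky foundation beneath it):

\[ \text{grasp after } n \text{ years of half-learning} \approx \left(\tfrac{1}{2}\right)^{n} = \tfrac{1}{2},\ \tfrac{1}{4},\ \tfrac{1}{8},\ \dots \]

which is why gaps in arithmetic reappear, magnified, in algebra and later in calculus.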

I can’t help but think that Bloom’s followers will not consider maths as ‘higher-order’ until they have turned maths into something it is not. It appears to me that university-level mathematics is still considered a ‘lower-order’ activity according to Bloom’s taxonomy.

Bloom’s ideology also inhibits the development of maths skills in our schools, through the introduction of written assignments in maths and the insistence on the use of multimedia and technology. These things do not build basic mathematical skills anywhere near as well as doing regular homework and studying for an exam. Written assignments keep everyone busy, and basic maths is left out.
High school Chemistry and Physics are also now being distorted by the inappropriate introduction of very long written assignments.

The imposition of Bloom’s ideology also creates much red tape. The paperwork requirements placed on teachers waste so much time that they obstruct students’ learning. For example, when a teacher marks a maths test, he or she is forbidden from following the standard practice of awarding a numerical mark for each question and adding these up to get a total score.

Instead, for each question, teachers must award letter grades across three different categories. The appropriate letter is to be chosen by reading and considering perhaps fifty paragraphs of descriptors. To give you an idea, here is one such paragraph:

“The student work has the following characteristics:
use of problem solving strategies to interpret, clarify and analyze problems to develop responses to routine and non-routine simple tasks in life-related or abstract situations”

So a task that is done by every teacher, for every student, on every piece of assessment, and that should be simple and routine, is in Queensland not simple at all, but instead a festival of cultural deliberation. After all these ‘festivities’, the mystery of how to combine the letter grades begins. Later on, this combination emerges, somehow transfigured, on the report to parents as one of maybe five uncomfortably worded sentences. The whole process proceeds officially uncontaminated by numbers.
“How is Johnny going in maths?” remains the question on everyone’s lips.

2. How can we fix it?

This article has briefly indicated how the imposition of Bloom’s ideology on our teachers has prevented a generation of our youth from gaining maths skills.

‘Education theory’ and psychology are relatively new and speculative areas of study, putting forward frequently changing ideas. In hindsight, we might question why we ever placed educational theorists in a position of authority over the learning of mathematics. It does not seem appropriate to subject a whole population to unproven ideas of a speculative nature. People have been learning mathematics for thousands of years; one would think that traditional approaches would be safer and more reliable.

The key to fixing this problem is to have experts in the actual discipline of study responsible for the curriculum and assessment of that discipline, rather than appointing those who imagine that every kind of learning is the same. When it comes to mathematics, an appropriate panel of experts might consist of very experienced maths teachers, engineers and mathematicians. Physicists, chemists and economists might also suit. (Caution: degrees called ‘mathematics education’ generally contain little or no mathematics, and a lot of ‘education’.)

However we decide to restructure, and whoever we appoint, the new body governing mathematics in schools must be answerable to someone, unlike the Queensland Studies Authority, which was set up as a statutory body answerable only to itself.

I feel that we should keep state sovereignty over education as much as possible, even though the proposed national curriculum looks better than our present one. My reason is that if or when the national education bodies begin to move down silly paths, it will be much more difficult to turn them around. Will the national body appoint people who do mathematics, or people who do education?

To conclude, there is some good news. We can be sure that:

1. our current low performance in maths is not due to any intrinsic or innate stupidity,
2. this problem can be solved, and
3. it is not an issue of needing to spend more time or money.

A good mathematics course will build a student’s confidence in his or her own ability to reason clearly and correctly. After completing it, a student may go on to apply this ability to his or her chosen pursuits in life. Shall we pass on this gift to the next generation?