Stress, Workload and Assessment

It is concerning to me that the QSA seems to be trying to introduce this nonsense into the assessment criteria for Year 10 under the Australian Curriculum. We certainly don’t need this level of workload in Year 10, as well as in Years 11 and 12.

It is also interesting to note that part of the agenda for the upcoming Panel training sessions in Mathematics includes a section entitled “Roles and Responsibilities”, where the topic for discussion is “Supporting and Advocating for externally moderated school based assessment”. Although I guess that doesn’t necessarily suggest the nature of that school-based assessment.

Two big concerns in all of this are the stress that this form of assessment puts on students and, equally importantly, the considerably increased workload for teachers, which was brought in under the radar, without consultation, and ultimately without any financial compensation.

The ultimate concern, however, is that the immediacy of feedback from assessment, probably the main purpose of assessment, has been lost in a maze of criteria sheets that at the end of the day mean little to the students. As a typical example, if I have 25 students in a 12 Maths B class, and a typical exam takes about 45-60 minutes to mark thoroughly and determine an outcome, I am faced with up to 25 hours of marking before students get any feedback at all. Assignments are even worse, and the task means little to the students by the time it is returned.

As a teacher who, by necessity, teaches 11 Maths B, 12 Maths B and 12 Maths C, along with a Year 10 class, I find the assessment workload is becoming horrendous. For students doing a subject combination like Maths B, Maths C, Physics and Chemistry, I can only imagine the stress this system places upon them.

Laborious and Subjective

I would like to voice my wholehearted support for this movement to have the QSA review its assessment policies for Maths and Science.

I have been a teacher of both maths and science (main area is Senior Physics) for 17 years.

In all of my experience to date, I have been involved with systems that use both marks-based and criteria-based methods of assessment. However, I have never come across such a laborious and subjective system as the one the QSA has implemented since 2007, and prior to this. Such criteria are totally open to interpretation and are really a play on words (e.g. exploration, explanation, analysis, evaluation, etc.).

I also wish to express my deep concern about the word length and expectations of the EEIs and ERTs, particularly as a panellist, where you get to see EEIs that are produced by some students in some of the more “exclusive” schools in the area (no doubt with help from a well-paid tutor). Such submissions tend to cloud the view of a subject panellist to “expect” this type of response from an “A-level” student.

I would welcome any opportunity to attend any future meetings, and I am more than willing to put my signature and support to moves towards making the QSA review these procedures, in favour of a more common-sense and reasonable approach to assessing “16 and 17 year olds”, and not “university honours or thesis students”, which is what these EEIs and ERTs seem to have evolved into.

Why QSA’s assessment system is wrong

Currently all assessment in Queensland schools is school based, meaning each teacher sets the exam papers, marks them and grades their students. In my opinion this form of assessment lends itself to many abuses and makes comparison of performance between schools meaningless. Listed below are 12 of my concerns. I must stress that none of these concerns is the result of any incident that happened at the school where I am currently teaching.

  1. Schools have the opportunity to offer revision sheets that have direct similarities to the exam students sit a few days later. Also, some teachers, during revision, work through questions similar to those that appear in the exam.
  2. Most schools recycle a substantial percentage of an exam paper year after year. Students are able to access their exam papers after term 1 of the following year, raising issues of future student familiarity with the assessment items provided by the school.
  3. All members of the teaching staff have electronic access to assessment items stored on the network. It is impossible to know when an unauthorised copy has been made. A similar situation occurs when paper copies of exams are stored in unsecured areas accessible by all staff and, in some cases, by students. Some of these staff have relatives studying at the school, and some tutor students from the school.
  4. There have also been instances where students have managed to gain access to the staff drive on the school computer system and copy supposedly secure files.
  5. Old exam papers and draft versions of upcoming exam papers are often not disposed of securely. It is not uncommon to see students rummaging through school rubbish bins prior to an exam period.
  6. Inconsistency in teacher marking and in levels of exam difficulty between schools within a district, and between districts, is impossible to monitor or moderate. A student’s grade depends more on which school they attend or who their teacher is than on what they know.
  7. It is difficult for a panellist to determine whether challenging/complex problems included in an assessment package have been partially or fully rehearsed.
  8. The level of assistance provided to students in an exam is often not disclosed. ‘Assistance’ changes the conditions of the exam and is difficult for the panellists to gauge. Some teachers help students during exams. Some schools allow their students to bring with them one or two pages of ‘cheat sheets’ containing formulae, examples, definitions, graphs and diagrams. These sheets are not attached to the students’ answer sheets when they are sent to the panel. In one case I know of (not at the current school), two pages containing differentiation and integration formulae with worked examples were handed to the students during a Maths B exam and were not attached to the answer sheets when sent to the panel.
  9. Teachers know what questions are in the exam and, when talking to students, are placed in the situation of reassuring students without jeopardising the ‘unseen’ nature of an exam.
  10.  Sometimes students doing the same exam are allowed to sit next to each other. A quick glance at the neighbour’s graph could mean the difference between getting an A and an E in that question.
  11.  Students who are unable to sit an exam on the scheduled day, due to illness or being on holidays, sit the exam on their return. These students have the opportunity to find out from their fellow students what questions appeared in the exam.
  12.  Students are made to feel they mustn’t upset their teacher in any way, in case they are penalised for their behaviour when their papers are marked.

The current system is unfair to students, teachers and schools. It is naive to assume that breaches are easily detected at moderation/verification. The panel does not have the time, resources or sometimes the expertise to investigate all possible breaches. Even if breaches are detected, verification occurs in October, making it difficult for schools to fix any problems, and limiting opportunities for students to suddenly demonstrate their full potential.

It is reasonable to assume that most teachers will conduct themselves in a professional and ethical manner most of the time. But it isn’t reasonable to assume that all teachers will conduct themselves in a professional and ethical manner all of the time, especially when their performance as a teacher is often judged by the number of VHA and HA students they produce.

I also keep asking myself why it is that, after 40+ years of all the ‘educational benefits’ the moderated school-based assessment system has brought to Queensland, the overwhelming majority of school authorities around the world, including those in the other five states in Australia, continue to refuse to adopt it.

A totally school-based assessment system is unreliable and makes comparison of student achievement meaningless. Without some form of external assessment, we do not have a full picture of the current standard of student performance or school performance. The National Curriculum must include clear statements on assessment and not leave it entirely to organisations such as the QSA, which favour a totally school-based system.

As a parent and teacher

As a parent of high school children, and a teacher at the primary school level, I concur with all that has been articulated thus far and applaud your efforts and time on this issue. As an example, one of my children studies Biology in Yr 12 and was recently required to complete an assessment on [topic removed]. An inordinate amount of his time, as well as mine (because of his limited knowledge, at age 16/17, of the intricacies of this very complex topic), was spent trying to find the right set of arguments with which to tackle this issue, before he proceeded to research material and documents and then begin the writing process. From my involvement in this, it became an exercise in how well one could create a splendid piece of writing that clearly articulated the argument or arguments taken. This became such a grind for him, even though he excels at English and writing, that he was becoming ‘disillusioned’ with Biology and where it was taking him. It appeared to be more about the research and writing process of producing a coherent piece of work and less about the ‘nuts and bolts’ of this type of engineering, as fascinating as it is.

In my view, high school learning should be more about acquiring the content of particular learning areas so that a firm foundation of knowledge is laid, prior to any tertiary studies. Without the requisite core knowledge and ‘inspiration’ for further research and learning beyond the school system, we are sadly doing a disservice to our young people.

With respect to my teaching at the primary school level, there is sufficient evidence of similar assessment methods being implemented by the QSA, e.g. QCATs (Queensland Comparable Assessment Tasks, for Yr 4, 6 & 9 students), as well as the new (Qld only) C2C (Curriculum into the Classroom) implementation of the National Curriculum agenda. A grossly convoluted set of assessment methods exists for determining the ultimate grade level received by a child on their report card. QCAT assessment tasks are a creation unique to the QSA. Teachers conducting these tasks with students spend a disproportionate amount of time administering and marking them, relative to the benefit to students. The whole aim of conducting QCATs (according to the QSA) is purely as a moderation process for teachers across the state, i.e. is an ‘A’ grade or a ‘B’ grade the same for the same piece of work, for all teachers? Parents then have the ‘pleasure’ of seeing these results on their child’s report card, and are confounded by their relevance to their child’s real abilities. The marking of these assessment tasks can take up to three weekends of a teacher’s time to complete, if done as prescribed by the GTMJ (Guide To Making Judgements) marking guide/process. This precious time would be better spent creating innovative and inspiring lessons for our ‘younger minds’. The problem is, those in the QSA are not on the ground listening to people like us at the coalface of teaching.

Marks in Senior Maths

We still use marks in all of our Maths assessment, in both KAPS and MAPS, to collect data on students. I know this is frowned upon by the QSA. However, I find marks are the best way to give students feedback on their progress and to give appropriate “weighting” to various questions. In addition, marks automatically weight the various assessment items.

Most schools use the following format each semester:

Mid-semester test
An EMPS (i.e. an assignment)
End-of-semester test

Students are awarded a standard in each of the three criteria (KAPS, MAPS and CAJ) on each item. If one “averages” these standards, there is an implicit equal weighting given to each of the three items. However, in the vast majority of cases, the EMPS covers a smaller amount of work. As a rule of thumb, my tests are worth 40% each and the EMPS is 20% of the total marks.
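To make the contrast concrete, here is a minimal sketch (my own illustration, not anything prescribed by the QSA) of how averaging the three items equally compares with weighting by marks; the 40/40/20 split matches my rule of thumb above, and the percentage results are invented purely for the example.

```python
# Illustrative sketch only: compares equal weighting (implicit when "averaging"
# the three items) with a 40/40/20 marks-based weighting. All numbers are hypothetical.

weights = {"mid_semester_test": 0.40, "EMPS": 0.20, "end_semester_test": 0.40}

# Hypothetical percentage results for one student on each assessment item.
results = {"mid_semester_test": 82.0, "EMPS": 55.0, "end_semester_test": 78.0}

# Averaging the items gives each one equal (one-third) influence, even though
# the EMPS covers a smaller amount of work.
equal_weighted = sum(results.values()) / len(results)

# Weighting by marks lets the two tests (40% each) outweigh the smaller EMPS (20%).
mark_weighted = sum(weights[item] * results[item] for item in results)

print(f"Equal weighting:    {equal_weighted:.1f}")  # about 71.7
print(f"40/40/20 weighting: {mark_weighted:.1f}")   # 75.0
```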

I am happy to retain the EMPS.

Talking to the science teachers in the staffroom: yes, the EEIs are very large, there is an expectation regarding size, and they would certainly appreciate an expectation of shorter/smaller EEIs. However, they are keen on retaining them. Many or most would be happy to use marks.

Biology’s Situation

I am not sure if you are aware of Biology’s situation so I would like to fill you in:

1. Back in the early 2000s, after Chem and Physics told the QSA that they were not happy with the new syllabuses during the original trial, the Biology syllabus was forced upon us with no trial year. This was done by manipulating the definition of a minor/major change, and the changes were major (everything was changed, including criteria, assessment etc., except the basic content). Biology teachers had many meetings with the QSA and we were basically told to ‘get on with it and make it work, as that was our job’.

2. The original syllabus (2004) was so unworkable that in 2006 they had to put out an amendment to it which was nearly as big as the syllabus itself. A review is long overdue, but has now been put on hold until the National Curriculum comes in. There are still major flaws, e.g. a lack of cohesion between the exit statements and the objectives.

3. Criteria – we also have three criteria in Biology, but unlike the other sciences and senior maths, our criteria are all separate. I believe that for the other subjects, the third criterion is part of the other two. In Biology it is called ‘Evaluating Biological Issues’, which in itself involves many processes that we had to teach our students. This has compounded our workload, as you can imagine. The QSA has only in the last year or so started to come up with any examples to guide us in this area.

4. In 2005 a group of coordinators put together a CD to help teachers who were struggling. The QSA ended up being part of this because by then they had realised what was happening. This CD contained only examples of assessment, not exemplars, as we were still trying to get our heads around the criteria.

5. Assessment – multiplied to the nth degree. The QSA expects no multiple choice (which I have just ignored), expects criteria sheets and no marks (I feel you can put the two together to achieve a standard, and this is how I have worked it), and expects questions to be ‘open ended’ so that you have to award standards according to ‘how far’ a student can answer the question, which discriminates against SA and low HA students. Our ability to discriminate clearly between students is now more difficult. I have rejected these approaches, BUT I could do this only because I have been teaching for a long time and felt confident in my approach. The huge worry has been the newer teachers in smaller country schools with no resources. Marking is horrendous. At least with Maths, Chem and Physics there is an analytical-type approach to the questions. With Biology we don’t even have this option. Then there are the ERs and EEIs – for Biology the ER is compulsory. This is a major increase in workload for students, teachers and lab technicians. Because we have had to restructure our teaching around and through these pieces of assessment, the time we have available to spend on concepts has been reduced. However, the trial and error that has occurred over the last 8 years is incredible, with little support from the QSA – even at different workshops we were told different things.

6. The workload overall is massive. I know many excellent Biology teachers who now refuse to teach the subject because of this. As for our students – I admire them for their effort. This must affect numbers in the Senior Sciences overall. In fact, the QSA in 2004 actually said that they expected Biology numbers to decrease.

I am sure there is more I could add but will leave it here.

Ironic Shame

I would like very much to attend the meeting on June 16th; however, I am currently overwhelmed by the demands of marking my senior Chemistry classes’ assessment (EEIs and criteria-based examinations).

I have also been advised by colleagues not to attend, on the grounds that it may later affect my employment prospects (though I am not sure how much I believe that to be true). I just wanted you and your colleagues to know that if the number of teachers at your meeting is in any way disappointing, it is likely due to one of these reasons. The majority of teachers in my staffroom welcome this discussion, but will probably also be too busy with the demands of criteria-based marking.

It is truly a very ironic shame that many teachers will be absent and miss the opportunity to share their ample opinions on the topic.

Good luck with the discussion.

Stress and Distress of Students

From my perspective as both a parent and a professional who has worked closely with young people, I am very supportive of any initiative which challenges the current system.

Personally I have three daughters who have completed year 12 and one daughter who is currently in Yr 12.

Professionally, I have lost count of the number of young people in Years 11 and 12 who have expressed their stress, and indeed distress, at the seemingly ever-increasing workload, including the vast array of assessment pieces.

My eldest daughter is currently studying at Masters level, and I would venture to say that she finds her workload more realistic than that of my daughter in Year 12.

It saddens me to see the effect that this stress is having in this developmentally significant phase of their lives, as well as the potentially ongoing and sometimes insidious effects which may go unnoticed until they emerge as fully blown mental health issues.

I do not believe that this is alarmist; rather, it reflects my observations of what I have seen first-hand and what has been shared with me by young people themselves.

Our young people are a precious and unique asset, both within our families and our community. It behoves us to tread carefully and, rather than set them up for potential burnout, to instead equip them with the intellectual curiosity and emotional resilience which set them on a pathway to becoming both enthusiastic lifelong learners and compassionate, wise members of our global community.

I recently had a conversation with a teacher who stated that she no longer had time to have a relationship with her students. This struck a deep chord at the time and I wondered what this may actually mean. There seemed to me to be a great sense of loss in this statement. Indeed something to reflect upon.

Jennifer McMahon
B Health and Community Service, Major in Counselling

Concerns on Assessment

My major concerns on assessment centre around the following:

1. Despite stated word limits for tasks, the limits refer only to a relatively small proportion of the total assignment – this is particularly true of the EEI.

2. The ability to communicate clearly and precisely underpins success, instead of the concepts, principles and applications taking primacy. That is, of two students with the same chemistry ability, the one with the greater facility with communication will be assessed at the higher level.

3. Criterion-based assessment is still a blunt-edged tool. Despite increased consistency among panellists, differing interpretations still exist on the features and characteristics of work required to meet specific standards.

4. The EEI is a flawed piece of assessment – it is not possible, within the time and resource constraints imposed, to design and conduct an investigation of the sophistication originally intended by the syllabus documents.

5. The learning is the assessment. With the extended tasks, students have no time to incorporate complex concepts into their cognitive framework before they are assessed on them. The time for reflection and practising until processes are ingrained has gone.

6. The workload on both teachers and students is overwhelming. I know, even with my experience, that it takes approximately one hour to assess a single student’s EEI to the best of my ability.

All of this is not to suggest that our students do not have valuable skill sets when they exit secondary school; it is purely that the skill set is different. Knowledge of details and the ability to undertake routine applications have been diminished, but the ability to analyse and evaluate has been enhanced.