Concerns of a Physics Teacher

My concerns with the new syllabus are as follows:

  •   A syllabus document should be general enough and easy to implement, whether at a large school in the city with plenty of collegiate support or at a small school in the country with a single specialist teacher and little collegiate support. Even though I have 15 years of teaching experience and 5 years of industry experience prior to that, my training has not prepared me for the large amount of English content that I now have to wade through. Most of us studied physics because English was not our forte and we had a more mathematical bent. This syllabus has virtually made me feel as if I am a novice all over again. Genuine novices in country postings would be finding the going very rough at present; it would be good if they knew about this site, as there appear to be few inexperienced teachers viewing it.
  •   A resource bank needs to be created before the general implementation of a new syllabus. Since the assessment developed for the pilot was deemed no longer suitable, two years is too long to wait for such a resource for the 2007 syllabus. The need for a resource bank is a very high priority.
  •   From what I understand, physics at university level is still very much the same as it was when I studied it 25 years ago. If so, what are we preparing students for if they are to sit skills-based exams and structured pracs in their first two years at university? If we thought we were protecting students’ brains from an overload of content by covering less in the new syllabus, they are in for a bigger shock if and when they decide to take on a physics-based subject at university. My work program does not cover optics, for instance, so I doubt I will get many students going on to optometry, because they have not been exposed to the content. The fact that many students coped very well with the old syllabus, and enjoyed the study, was a testament that we were doing some things right. The physics syllabus should be geared towards students studying the technical sciences at university level; to do anything else is a disservice to those students who wish to aim high. The extended writing skills now being pushed by the 2007 syllabus can be more than adequately covered in their English studies.
  •   In my part of the world there were few complaints from those who remained on the 1995 syllabus. I would like to know where the QSA collected its data from, or was it just a case of those who shouted the loudest? Were the same people still voicing their concerns in 2007? The 1995 syllabus still allowed me to have students complete an extended investigation on an experiment of their own design. It was successful because they had the necessary theory and practical skills behind them to complete it with some degree of challenge and complexity.
  •   The inability to use averaging, and the necessity to scan every written word for ‘evidence’ for a LOA, have made panel meetings a bit of a rubber-stamp exercise. There is no way that an average physics panellist (with a brain geared for mathematics) has enough time to scan 5 entire folios for the necessary evidence within the allotted time. In the absence of that evidence, we must either accept the teacher’s judgment because they said so, or spend inordinate amounts of time looking for something which possibly is not there. As a consequence, our panel meetings have been ending when the school cleaners are about to lock up the school!
  •   Alan Whyborn has already commented on the ‘shifty’ nature of the assessment criteria. It may be clear to those who have a greater command of the English language than I do, but I find that I can read the same piece of writing over again and give it totally different grades depending on how much emphasis I give to each descriptor within the exit criteria. It is too subjective and open to too many interpretations. This leads to an inordinate amount of time spent justifying grades to parents and students, when there are much more efficient ways, just as effective, of setting levels of achievement.

I would like to suggest a way forward. I feel that too much emphasis is placed on evaluating IP and EC. They are necessary, but we are unnecessarily inventing ways to assess these two criteria across the entire course. It is hard for students to critically analyse if they do not have a solid grounding in knowledge, but since knowledge is now just one third of what we do, any attempt at critical analysis seems to fall short for that reason. Wouldn’t it be better to have a Knowledge and Skills focus for most of the year, culminating in one final experiment of the students’ own choosing? This way we are not devaluing the skills base of our current physics teachers, and we are easing them into this new area of self-directed learning more gradually. Maybe in the future we can assess IP and EC across the entire course, but the reason for the large amount of dissent at present is that it is too much too soon.

This is a follow-up letter to further explain Item #6. I have used the IP2 and IP3 comments as examples; there are others.

It is the interpretation of the student’s work in terms of the exit criteria that appears to be a large hurdle. In isolation, the glossary is sufficient to explain the terms, but understanding what they mean in terms of the wide variation of submitted work is another matter.

I will try to explain the ‘shifty’ nature of the exit standards.

Take, for instance, the comments for IP2 across the A, B and C levels. The first part of the comment, ‘assessment of risk’, appears in the A, B and C columns. So if a student appears to have ‘assessed the risk’ in a task, do I give them an A, B or a C for this part of the assessment?

The second part of the comment, “safe selection of equipment”, appears in the B and C columns. If a student has selected their equipment safely, do I give them a B or a C? If they have modified the equipment in even the simplest way, does that qualify them for an A standard here?

The third part of the comment, “appropriate application of technology to gather, record and process valid data”, where ‘process’ is the B discriminator and ‘valid’ is the A discriminator, is the only part of the comment where there is clear discrimination between the levels. What is unclear, however, is how to interpret the words ‘process’ and ‘valid’ and what they mean in terms of the student’s submitted work. Also, if a student has obtained valid data but is unable to process it in a significant way, does the work fit under the A, B or C descriptor?

What if a student has worked unsafely and been unwittingly exposed to some risk, but has processed valid data? How do I give an overall achievement (on balance) when the student’s work displays both unsafe work practices and high analytical skills? Do I give more consideration to safety or to analysis, and does it depend on which side of the bed I got up on in the morning?

When I look at IP3 I need to be able to discriminate between obvious patterns, trends, errors and anomalies and less obvious ones. My opinion on what is obvious and what is not will differ from that of other teachers and students, and is highly debatable. It is this lack of clarification that makes the standards associated with the exit criteria difficult to use.