Proper Assessment: It’s in the Principles!

I was going to stay out of the debacle, but many people have been asking my opinion about the 2020 CXC examination results. I’m not here to be polemical, but rather to pose some questions that can be used for reflection and review. After all, that’s the stage we’re at as educators, right?

I’ve written about Assessment before, but it seems to be a topic that just never gets old. We’re always trying to do it better and that’s great! Yet no matter how creative we get, whether we assess online, traditionally, or during a pandemic… Assessment should always be governed by Principles.

With the 2020 CXC exams, the Validity and Reliability principles seem to be in question. Therefore, in my humble opinion, review committees should start there. A critical and detailed analysis of the Multiple Choice exams is definitely in order. Some may argue that this should have happened BEFORE the exams were written by candidates (I’d like to think it did), but regardless, we need to put emotions aside and be objective. After all, we are reflective practitioners.

So let’s ask some reflective questions about the exams and trust that CXC is doing the same.

1. What does the Multiple Choice Item Analysis tell us?: Item analysis allows examiners to determine which items students found difficult or easy. From there, they can ask questions like: Was it a problem with the question itself? Did its phrasing cause a lack of clarity? Were the answer choices too close? Honestly, until my Dip. Ed. program, I didn’t realise how many factors you need to consider when writing multiple choice questions. Something as simple as putting the choices in alphabetical order is often overlooked, and that oversight can cause difficulty for a candidate.

Exam reviewers should also consider whether “a sufficient number and variety of questions were used to determine candidate mastery of topics tested.” Were all areas tested equally, or were some topics given greater weighting? Also, what thinking skills were needed to answer the questions? Were there too many knowledge questions and perhaps not enough that demanded application?

Ideally, much of this analysis should happen before the administration of the exam. But if I were part of a review committee, this is where I would start. (A small sketch of what that analysis might look like follows below.)
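To make question 1 a little more concrete, here is a minimal sketch of the kind of item analysis I have in mind. The two indices computed below (classical item difficulty and upper-minus-lower discrimination) are standard psychometric measures, but the response data and thresholds are entirely invented for illustration; this is not CXC’s actual procedure.

```python
# A minimal sketch of a classical item analysis, assuming a 0/1-scored
# response matrix (rows = candidates, columns = items). The indices are
# standard psychometric ones; the data is invented for illustration.

def item_difficulty(responses):
    """Proportion of candidates answering each item correctly (the p-value).
    Very low p suggests a hard or flawed item; very high p, one that
    barely discriminates."""
    n = len(responses)
    return [sum(row[i] for row in responses) / n
            for i in range(len(responses[0]))]

def item_discrimination(responses, group_fraction=0.27):
    """Upper-minus-lower discrimination index (D): rank candidates by total
    score, then compare each item's p-value in the top and bottom groups.
    D near zero or negative flags an item worth reviewing."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, int(len(ranked) * group_fraction))
    upper, lower = ranked[:k], ranked[-k:]
    return [sum(row[i] for row in upper) / k - sum(row[i] for row in lower) / k
            for i in range(len(responses[0]))]

# Toy data: 6 candidates x 4 items (1 = correct, 0 = incorrect)
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
]
print("difficulty:    ", item_difficulty(scores))
print("discrimination:", item_discrimination(scores, group_fraction=0.33))
```

An item that almost nobody got right, or that top-scoring candidates got wrong more often than bottom-scoring ones, is exactly the kind of item a review committee should pull out and inspect.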

2. Should Paper 2 have been omitted?: I must say, I groaned when I heard that MFLs would still have a Paper 2. Now, I’m kinda glad that they did. The benefit of long, structured answers in an exam is that students get to explain their point, probably earning themselves some marks in the process. Basically, they get to express themselves and their train of thought. They may not remember everything on a given topic, but they can write what they do. That gives them more of a fighting chance. Even if they stray off course, they might at least get 3 out of the 5 marks. And that…adds…up. With Multiple Choice, there’s no grey area, no half-marks… students are either right or wrong.

I’m not a huge fan of that, because I think it’s more important for students to understand and be able to explain why something’s right or wrong. You need to assess their higher-level thinking skills. With students moving into higher education, can they analyse or evaluate a problem and create a solution? Did the designed exams and relevant SBA components properly assess those levels of cognitive learning?

3. Where could Error (internal and external) have occurred?: We don’t like to think about it, but it’s inevitable when we administer exams. As J. H. McMillan (2011) states, “in the end, you get an observed score that is made up of the actual or true performance plus some degree of error” (p. 76). As educators, we know that reducing error increases reliability.
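For anyone who likes the notation, McMillan’s point is the familiar classical test theory decomposition (my restatement, not a quote from his text):

```latex
% Observed score (X) = true score (T) + error (E):
\[ X = T + E \]
% Reliability is the share of observed-score variance that is true
% rather than error, so the smaller the error variance, the higher it gets:
\[ \rho_{XX'} = \frac{\sigma_T^2}{\sigma_X^2}
             = \frac{\sigma_T^2}{\sigma_T^2 + \sigma_E^2} \]
```

In plain terms: the less error we let into the process, the more the observed score reflects what the candidate can actually do.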

So, what is error?

External error: unclear instructions, item ambiguity (as mentioned above), uncomfortable exam room conditions, interruptions to exams, exam sampling, scoring bias… (It can be a Pandora’s box once opened, but while we may not be able to eliminate all error, we can work to minimise it.)

So my next step in the review process would be to look at data from examiners’/invigilators’/moderators’ reports. What were the irregularities? What were the dominant problem areas? Were there significant disruptions on the days of the exams?

Internal error: Candidate health, mood, motivation, anxiety, fatigue, test-taking skills and general ability to sit the exam.

I think we can all agree that a significant degree of internal error might have occurred in these exams. There is no doubt that students were placed under a lot of stress in the lead-up to their exams. They worked through it, persevered and, honestly, they are the true superstars of this whole exam process.

I think that’s why I decided to write this post, because my heart really does go out to them now. They are the ones who stand to be affected the most and it’s their future at stake.

Which leads me to question 4: What might be the examination body’s responsibility to the stakeholders of these exams? With regional governments up in arms and both parents and students demanding an explanation, this exam review needs to be done with wholehearted commitment, objectivity and transparency.

In a media release on March 26th, CXC announced a strategy which included the employment of an:

“e-Testing modality (online and offline) in order to reduce the examinations administration processing time resulting in the shortest turn-around time for marking and the release of examination results. In addition, it will provide an opportunity for the timely presentation of grades to facilitate matriculation to higher education or to access employment. This also minimizes disruption to the 2020/2021 academic year.”

https://www.cxc.org/may-june-strategy-2020/

Was this modality effective? Did the fast turn-around affect the marking process? Should the minimisation of disruption to the 2020/2021 academic year have been a priority? And if yes, was it well managed?

In another press release, dated May 15th 2020, it is stated that:

“During grading, CXC’s Quality Assurance Process will apply the appropriate weighting to ensure that candidates are treated fairly and in an unbiased manner.”

https://www.cxc.org/cxc-statement-july-2020/

Since this was done, generating a report on the process should not be difficult. In fact, I’m pretty sure one has been prepared already. At this point, all supporting data is critical.
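We don’t actually know what CXC’s weighting formula was; it hasn’t been published. But purely as a hypothetical illustration of what “applying a weighting” tends to mean in grading, it usually boils down to something like this (the component names, weights and scores here are all mine, not CXC’s):

```python
# Hypothetical illustration only: the component names and weights are
# invented, not CXC's published moderation formula.

def weighted_grade(components, weights):
    """Combine component percentages into one final percentage.
    The weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(components[name] * weights[name] for name in weights)

# e.g. a candidate with 70% on Paper 1 (multiple choice) and 82% on the SBA,
# under a made-up 60/40 weighting:
final = weighted_grade({"paper1": 70, "sba": 82},
                       {"paper1": 0.6, "sba": 0.4})
print(final)  # 74.8
```

The open question for reviewers is whether the weights actually chosen really did keep candidates “treated fairly and in an unbiased manner”, which is exactly why a report on the process matters.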

And now to my last question,

5. Is a return to an International Examination body the answer?

Honestly, let’s not jump the gun. We need to make informed decisions in everything we do. And I do believe that, regionally, we may have an internal bias that what is foreign is better (c’mon, you sometimes reach for the Pringles before the Sunshine Snacks too). Personally, I believe we have amazing talent and expertise which we can tap close to home, if not at home itself. ‘We home’ anyways, right Kes? (lol)

Bear in mind also that context is important to most of life, even exams. Do we really want to give ours up, or simply push it into second place? Or should we just work on improving what exists? Personally, my vision involves building a stronger region, committed to consistent improvement in education and assessment. We have what it takes, and once we tap the right resources, we can make it happen.

Some people may not agree with me and that’s fine. Some people may also have a lot of other questions they want answered and that’s also fine. All I really hope you understand from this post is this:

Proper Assessment is governed by Principles. Start your inquiry/review process from there, and you’ll get the most useful answers.
