Saturday, December 12, 2009

Blooper Reel

Just like those movie reels of funny mistakes and gags caught when the actors mess things up during filming, we at the BSDCG recently got to see a blooper of our own.

After receiving his exam results, one of the examinees noticed that the scores listed on the exam results PDF seemed a bit odd. He contacted us through the website to say he thought the results were incorrect. Turns out, he was right.

[Figure: the incorrect exam results report]
The figure above is from the incorrect exam results PDF that was sent to our sharp-eyed exam candidate. It shows, from left to right, the Knowledge Domains on the exam, the candidate's score for each Knowledge Domain, and the Group Minimum, Average, and Maximum sub-scores for each Domain. The "Group" scores summarize the results of all recent candidates for the exam.

Notice anything unusual? Check the Group Maximum score for "Basic System Administration". It's listed as "17%". Now ask yourself: is it really plausible that the highest score anyone has achieved on Basic System Administration is just 17%, especially on an exam that has been available for over a year?

Turns out, it wasn't. Below are the corrected scores for the same candidate.
[Figure: the corrected exam results report]
As you can see, the corrected Group scores (Minimum, Average, and Maximum) are much closer to what you would expect in the real world. That "17%" Group Maximum for "Basic System Administration" is really "92%", reflecting the fact that the group's scores are significantly better than what was previously reported.
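The Group rows themselves are nothing exotic: they are just descriptive statistics taken over every recent candidate. Here is a minimal sketch, in Python, of how per-Domain figures like these might be computed. This is not our actual scoring code; apart from "Basic System Administration", the domain names and all of the scores below are invented for illustration.

    # Hypothetical per-domain score lists for all recent candidates.
    # Only "Basic System Administration" is a domain named in this post;
    # the other names and every number here are made up.
    results = {
        "Basic System Administration": [62, 75, 81, 92],
        "Network Administration":      [55, 68, 79, 88],
        "Basic Unix Skills":           [70, 77, 85, 95],
    }

    for domain, scores in results.items():
        group_min = min(scores)                  # lowest recent score
        group_avg = sum(scores) / len(scores)    # mean of recent scores
        group_max = max(scores)                  # highest recent score
        print(f"{domain}: min {group_min}%  avg {group_avg:.0f}%  max {group_max}%")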

So... what went wrong?


Like many system failures, this one was more than just a single mistake. One factor was that we had recently introduced a new exam "form" (a separate set of test questions), and our exam scorer decided to roll the results into a new file. However, it's not just one file that has to get updated: there are several files, and we missed one.
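We don't publish our scoring tools, but this failure mode is a classic one, and a small pre-flight check can guard against it. Here is a hedged sketch in Python, assuming a hypothetical setup in which each exam form must be registered in several data files before scoring can run; the file names and the form identifier are all invented.

    from pathlib import Path
    import sys

    FORM_ID = "BSDA-FORM-2"          # hypothetical identifier for the new form
    REQUIRED_FILES = [               # hypothetical files that must all know the form
        "answer_key.dat",
        "domain_map.dat",
        "group_stats.dat",
    ]

    def mentions_form(path: str) -> bool:
        """True only if the file exists and registers the new form."""
        p = Path(path)
        return p.exists() and FORM_ID in p.read_text()

    missing = [f for f in REQUIRED_FILES if not mentions_form(f)]
    if missing:
        # Refuse to score anything until every file has been updated.
        sys.exit(f"form {FORM_ID} is not registered in: {', '.join(missing)}")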

At the same time, our reviewer who mails out the exam PDFs (uh, that's me) didn't carefully review this set. Actually, I did notice that the scores were strange, but assumed it was the result of the form change.

Once we were alerted to the issue, we rechecked and corrected the procedures we use and reran the exam scoring for everyone who took that set of exams. It turns out that the Pass/Fail status of the exams was not affected, just the individual and Group scores.
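One cheap safeguard, sketched below in Python under assumed data structures, is an internal consistency check on each report before it is mailed: a candidate's own sub-score can never fall outside the Group Minimum-to-Maximum range, and the Group Average must sit between the two. All numbers here are invented, but a report like the bad one, with a 17% Group Maximum, would be flagged.

    def check_report(candidate_scores, group_stats):
        """candidate_scores: {domain: pct}.  group_stats: {domain: (min, avg, max)}."""
        problems = []
        for domain, score in candidate_scores.items():
            gmin, gavg, gmax = group_stats[domain]
            if not gmin <= gavg <= gmax:
                problems.append(f"{domain}: inconsistent group stats "
                                f"{gmin}/{gavg}/{gmax}")
            if not gmin <= score <= gmax:
                problems.append(f"{domain}: candidate {score}% outside group "
                                f"range {gmin}%-{gmax}%")
        return problems

    # Invented example: any candidate scoring above a 17% Group Maximum
    # proves the group statistics themselves must be wrong.
    print(check_report({"Basic System Administration": 75},
                       {"Basic System Administration": (5, 11, 17)}))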

For us, this highlights again how important community involvement and feedback are to the success of the exam. In this case, we found the mistake and corrected it immediately. And I'm sure glad we did. It brings the importance of reviews back into focus. You can bet we will all be reviewing future exams much more carefully!
