Lessons Learned from Writing a National Certification Exam
Applying large-scale assessment ideas to small classes
About five years ago, I helped score a single exam question for the American Society for Biochemistry and Molecular Biology (ASBMB) Certification exam. Little did I know the effect it would have on my thinking and on my own test-question writing.
Briefly, about the exam itself: the ASBMB has developed a national certification exam (https://www.asbmb.org/education/certification-exam). It is a one-hour, 12- or 13-question exam, written and scored by faculty volunteers from across the country. Students who pass the exam at the highly proficient level receive an ASBMB certification with their Bachelor's degree.
I wanted to share a few things that I learned from writing questions, writing rubrics, and helping to score this exam. These lessons ensure fairness to the student, promote consistency in grading, and bring deserved attention to question and rubric details.
Lesson 1: Ideally, the person or committee that writes an exam question should not be the one to write the rubric or score the answers. When we write a question, we are already biased toward a particular answer.
Lesson 2: Exams should be graded anonymously, especially when you know the students.
Lesson 3: Exams should be scored by more than one rater, and inter-rater reliability should be calculated and addressed.
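One common way to quantify inter-rater reliability is Cohen's kappa, which measures agreement between two raters beyond what chance alone would produce. As a minimal sketch (the scores and function name here are illustrative, not part of the ASBMB process):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same set of answers.

    Returns 1.0 for perfect agreement, 0.0 for agreement no better
    than chance, and negative values for worse-than-chance agreement.
    """
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same non-empty set of answers")
    n = len(rater_a)
    # Observed proportion of answers on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
    if expected == 1:
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (0-2 points) from two raters on six answers:
print(round(cohens_kappa([2, 2, 1, 0, 2, 1], [2, 1, 1, 0, 2, 2]), 3))
```

A kappa well below about 0.6 or 0.7 (rules of thumb vary) suggests the rubric is ambiguous and the raters should meet, compare discrepant answers, and revise the rubric before scoring continues.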
Lesson 4: Questions should be reevaluated and improved after each administration of the exam.
Lesson 5: The same question should be graded across all exams at the same time, rather than grading each student's exam from start to finish; this keeps the standard consistent from answer to answer.
Lesson 6: Questions should target different levels of Bloom's taxonomy so that the exam can also serve as a teaching diagnostic. Since we aspire to teach students to analyze, evaluate, and create, categorizing questions this way reveals how well we are teaching those skills.
The ASBMB scored almost 1,000 exams last year, so it is a large-scale process, but we can all apply these ideas on a small scale here at Touro College, especially in the undergraduate divisions. For example, a colleague can grade three students' exams from your class to check whether inter-rater reliability is an issue. Instead of asking a peer to take one of your full exams, ask them to take a five-question quiz containing the five new questions you are vetting for this year's final. Likewise, another professor teaching the same subject could help you with a rubric to make sure it is objective, not biased toward what you remember saying in lecture.
Written assessments are difficult to develop, and we shouldn't do it alone.