Thursday, December 26, 2019

How many answer choices is best for a multiple-choice question? Probably not what you think.

Last week I discussed a quiz I developed to help people analyze what they know and don't know about developing valuable and valid multiple-choice questions (mcqs). The mcq quiz: https://forms.gle/FMcuPywmKFLJ69je8.

Last week I shared the answer to one of the questions: Is “describe how to” a good behavior/action to use in a workplace learning objective? The answer is no and I explained why.

This week I'm discussing another question on the quiz. Here's the question.
What is the optimal number of answer choices for a multiple-choice question, according to research? (select the best answer) 
  • Three 
  • Four
  • Five 
The image below shows the pattern of replies from the 110 people who answered (so far). The green bar indicates the correct answer. The other two are incorrect.

Most people think MORE answer choices are better. That's because more answer choices appear to lower the chance of guessing correctly: with four answer choices, a blind guess appears to have a 25% chance of being correct; with five, 20%.

But the number of answer choices is not the only thing that determines the chance of guessing correctly. The quality of the answer choices makes all the difference.

Research shows that most people have a very hard time writing good multiple-choice questions. They have a hard time writing good answer choices. Mcqs with four or five answer choices often perform poorly because the answer choices are poorly written and make it easy to guess which are incorrect.
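To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not from the research cited below): the effective chance of guessing correctly is roughly one divided by the number of plausible choices, not the number of choices printed on the page.

```python
# Illustrative sketch only: effective guessing odds depend on how many choices
# a test taker can't immediately rule out, not on how many are listed.

def guess_probability(total_choices, implausible_choices=0):
    """Chance of guessing correctly after eliminating obviously wrong options."""
    plausible = total_choices - implausible_choices
    return 1 / plausible

print(guess_probability(3))                          # 0.33 -- three good choices
print(guess_probability(5, implausible_choices=2))   # 0.33 -- five listed, two giveaways
print(guess_probability(4, implausible_choices=2))   # 0.50 -- easier than three good ones
```

In other words, a five-choice question with two giveaway distractors is no harder to guess than a well-written three-choice question, and a four-choice question with two giveaways is easier.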

So research tells us to do the following when writing answer choices.

  1. Learn how to write good correct and incorrect answer choices!
  2. Write two GOOD incorrect answer choices and one GOOD correct answer choice.
If you write mcqs, you cannot afford poorly written answer choices; they can damage assessments, people, and the organization.

Added:
Thanks to Anand Chandarana for reminding me to share a link to an in-depth review of the research on the number of answer choices.

Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13. http://www.highpoint.edu/citl/files/2017/06/Three_Options_Are_Optimal_for_MCQ_Rodriguez_2005.pdf
-----------

If you need to learn the secrets to designing valid mcq assessments, take my upcoming Write Learning Assessments course http://bit.ly/dlaw-wla. I'll teach you how to write GOOD correct and incorrect answer choices because these are essential to good mcqs and good mcq assessments. 


My take: If you design mcqs, you need to master this difficult skill. I'd love to teach you how.

Monday, December 16, 2019

"Describe how to..." is (usually) an inadequate learning objective

I developed a quiz (https://forms.gle/FMcuPywmKFLJ69je8) to help people analyze what they know and don't know about developing valuable and valid multiple-choice questions (mcqs). One of the questions:

Is “Describe how to” a good behavior/action to use in a workplace learning objective? (select the best answer)
  • “Describe how to” is a measurable behavior/action so it is appropriate to use in a workplace learning objective.
  • “Describe how to” is rarely part of tasks so this behavior/action is likely written at too low of a level.
  • “Describe how to” is often part of tasks so this behavior/action is appropriate to use in a workplace learning objective.
The image below shows the pattern of replies from the first 54 people to answer. The green bar indicates the correct answer. The other two are incorrect.

Although "describe how to..." is a measurable behavior, it typically inadequate in that it is NOT what we want people to be able to do. So asking people to describe, especially in a workplace learning setting, is not usually what we want.

Let's say we're teaching people how to sum a column of numbers in a worksheet using MS Excel. They are not learning how to "describe" how to do it. They are learning how to sum a column of numbers using the SUM function (for example, =SUM(A1:A10)).


In other words, asking people to describe is at a lower level than actual, needed performance. So while it's measurable, it's not really enough.

I teach people to write learning objectives (LOs) that describe actual performance, including how achievement is measured. Actual performance-based LOs make writing meaningful mcqs MUCH easier.

---------------------------

If you need to learn the secrets to designing meaningful and valid mcq assessments, consider taking my upcoming Write Learning Assessments course http://bit.ly/dlaw-wla. I'll teach you how to write performance-based LOs in the course as well because they are the foundation of good mcqs and the right course content.


My take: If you design mcqs, you need to master this difficult skill. I'd love to teach you how.


Thursday, June 13, 2019

Why multiple-choice questions are (too often) problematic

Research shows that too many multiple-choice questions are written poorly and therefore create bad assessments. According to research, a few of the common issues are that multiple-choice questions too often
  • are unclear or otherwise poorly written.
  • are too easy to guess.
  • only test recall of content.
  • don't measure what they intend to measure.
  • become a test of something other than whether the test taker knows the content.
What's wrong with the following multiple-choice question?

Which of the following is not a good way to put out a grease fire in a pan on the stove? (Select the best answer.)
  1. Smother the fire with a metal lid.
  2. Smother the fire with water. 
  3. Smother the fire with baking soda or salt.
  4. Smother the fire with Class B dry chemical fire extinguisher contents.
The correct answer is 2. But research shows that many people who know the answer will get the question wrong anyway. That's because negatively-worded questions are harder to understand and easier to mess up. Answers 1, 3, and 4 are acceptable ways to put out a grease fire in a pan on the stove. And because those answers are correct, people who miss the word "not" are likely to select one of them and get the question wrong.
These and other problems (such as the problems in the list above) lead to invalid questions and assessments. Validity is the most important criterion for a good test. Validity refers to whether the test measures what it claims to measure. If it doesn't, the test answers provide little (or inaccurate) information about what people know or can do. Those tests waste time and resources. And if the test is used to make decisions (proceed to the next course, prove competence, etc.), poorly written tests are a legal battle waiting to happen.

To make assessments more valid, there must be a very clear match between learning objectives and assessment items. Research shows that this is way too often not the case.

Instructional writing, as I discuss in my book, Write and Organize for Deeper Learning, is different from other kinds of writing. Writing multiple-choice questions is specialized instructional writing. Clarity and readability are critical. But here's something multiple-choice questions also must do: they must be written so that participants' answers show who knows the content and who doesn't.

In the multiple-choice question at the beginning of this post, the negatively-worded question made it harder to understand. As a result, it was harder to answer correctly. Which makes the answer harder to interpret. If someone selects the wrong answer to the question at the beginning of this post, how sure are we that they didn't know the correct answer? We aren't sure.
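One classic way psychometricians check whether an item separates people who know the content from people who don't is an item discrimination index: the correlation between getting the item right and doing well on the rest of the test. Here's a minimal sketch (my own illustration with made-up numbers, not a method taken from the references below):

```python
# Illustrative sketch: item discrimination via a point-biserial-style correlation.
# A good item is answered correctly mostly by people who score well overall.

def item_discrimination(item_correct, total_scores):
    """Correlate item correctness (0/1) with each person's rest-of-test score."""
    n = len(item_correct)
    rest = [t - i for i, t in zip(item_correct, total_scores)]  # exclude the item itself
    mean_i, mean_r = sum(item_correct) / n, sum(rest) / n
    cov = sum((i - mean_i) * (r - mean_r) for i, r in zip(item_correct, rest)) / n
    var_i = sum((i - mean_i) ** 2 for i in item_correct) / n
    var_r = sum((r - mean_r) ** 2 for r in rest) / n
    return cov / (var_i * var_r) ** 0.5

item = [1, 1, 0, 1, 0, 0, 1, 0]    # made-up data: 1 = item answered correctly
totals = [9, 8, 4, 7, 5, 3, 8, 2]  # made-up total test scores
print(round(item_discrimination(item, totals), 2))  # 0.88: item discriminates well
```

An item like the grease-fire question above would tend to show weak discrimination, because knowledgeable people who misread the "not" get it wrong anyway.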

Luckily, research also offers clear and actionable tactics for making questions clearer and a better match to learning objectives. I used to run multiple-choice question-writing workshops for companies and higher education staff development. I loved teaching them. But one day usually wasn't long enough to gain the needed skills, and I feel a great need to help people gain real skills.

So I decided to build my first hands-on skills course on assessments and writing multiple-choice questions. It's a critical skill and it's rarely taught. You can learn more (a LOT more) and register. Or ask me to deliver this course for your team. 

Can I ask you to do me a favor? Please tell others about my Write Learning Assessments course and send them the link (bit.ly/DLAW0001). I am building a set of instructional writing courses and this is the first. 

References
Chiavaroli, N. (2017). Negatively-worded multiple choice questions: An avoidable threat to validity. Practical Assessment, Research & Evaluation, 22(3), 1-14.

Haladyna, T. M., & Downing, S. M. (1989). A taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 37-50.

Haladyna, T. M., & Downing, S. M. (1989). Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 51-78.

Hopkins, K. D. (1998). Educational and psychological measurement and evaluation. Needham Heights, MA: Allyn & Bacon.

Marsh, E. J., Roediger, H. L., Bjork, R. A., & Bjork, E. L. (2007). The memorial consequences of multiple choice testing. Psychonomic Bulletin & Review, 14, 194-199.

Marsh, E. J. & Cantor, A. D. (2014). Chapter 02: Learning from the test: Dos and don’ts for using multiple-choice tests, in McDaniel, M. A., Frey, R. F., Fitzpatrick, S. M., & Roediger, H. L. (Eds.), Integrating Cognitive Science with Innovative Teaching in STEM Disciplines, Washington University, Saint Louis, Missouri.

Roediger, H. L., III, & Marsh, E. J. (2005). The positive and negative consequences of multiple-choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 1155-1159.

Schuwirth, L. W. T. & van der Vleuten, C. P. M. (2004). Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education, 38, 974–979.

Shrock, S. A. & Coscarelli, W. C. C. (1989). Criterion-referenced test development. Reading, MA: Addison-Wesley.



Friday, May 24, 2019

Non-Conscious Aspects Of Learning And Performance

Being on autopilot has a lot of implications for learning and performance. Recently, Guy Wallace (@guywwallace on Twitter) posted about experts' difficulty in figuring out what people must learn to perform a task. Experts often unintentionally leave things out: their performance is highly automated, so they no longer have conscious access to exactly what they are doing.
Automated and non-conscious prior knowledge is stored in long-term memory. An expert’s deep prior knowledge makes them far more capable of solving difficult problems in their area of expertise. But because it’s automated and non-conscious, they’re often unaware of exactly what they are doing.
Guy pointed me to Richard Clark's article, The Impact of Non-Conscious Knowledge on Educational Technology Research and Design, and it turned out to be a goldmine of important information. Experts, research finds, tend to be conscious of the physical actions they take, as well as the knowledge they use. But they are much less aware of the mental activities they use to perform tasks and solve problems.

Monday, May 20, 2019

Should We Use Background Music With Instruction? No.

The general rationale for not using background music is that it increases harmful cognitive load. Cognitive load relates to the mental processes (like perception, thinking, and organizing) used for thinking, learning, and working. Working memory must process new information, but it has considerable constraints (in capacity for new material and in holding time). John Sweller, a well-known researcher and writer on memory, cognitive load, and other aspects of learning, reminds us that we must design around how our mental processes work. If we don't, people can't learn. And learning quickly is a mandate under current organizational conditions.
There are two types of cognitive load: helpful and harmful. We call the harmful type extraneous cognitive load and, when we don’t reduce this type of cognitive load, we make it harder to learn. Here are some examples of extraneous (harmful) cognitive load:
  • Too much content
  • Decorative and irrelevant graphics
  • Unnecessary explanations
  • Unnecessary media
Stop reading for a moment and think about why these items cause harmful cognitive load, given what I told you about working memory (Really! Try to answer the question before going ahead). Then look at my answer below.
Read the entire article on eLearning Industry.

Microlearning, Macrolearning. What Does Research Tell Us?


In the last year I have increasingly heard L&D practitioners talk about microlearning like it's "the answer." What is it the answer to, exactly? The response: nearly everything. But knowing that we must create learning experiences that fit specific needs, I felt doubtful. Still, until I understand what the preponderance of research says, my opinion is just a guess based on what I already know. So I set out to learn more, and this article sums up what I learned.
What does research say about microlearning? In this article, I’ll offer some definitions of microlearning that offer clues about important aspects and explain what research and researchers have to say about microlearning. I’ll compare what people say are the benefits of microlearning against what we know from research. And I’ll discuss what micro and macro approaches offer workplace learning and how we might use each.
I can sum up much of this article with a specific insight from Professor Christian Glahn at the Hochschule für Technik und Wirtschaft, who studies learning and work:
Microlearning is not the solution to all workplace learning needs.
Read the entire article on eLearning Industry.

How Well Do We Learn From Experiential Or Inquiry Learning Approaches?

Direct instruction directly teaches the content. People are supplied with content and activities that help them build needed background knowledge. And we make sure that what they know is correct and usable. Indirect approaches use experiential or inquiry methods that prompt discovery of needed information and often simulate and test performance.
Training people to identify hazardous materials in the workplace, for example, would likely have lessons, labs, and tests in a direct approach. In an experiential approach, people would likely work through scenarios or case studies.
Paulo Freire, a learning theorist, disapproved of what he called the “banking model of education,” where teachers (or trainers or instructors) deposit information into students’ heads. The learning sciences clearly show that we cannot directly fill people up with knowledge (my new book, Manage Memory for Learning, explains how we do learn). People do not “record” what they learn during instruction for playback during application.
Read the entire article on eLearning Industry.
