
Ian Glover talks about measuring the impact of digital learning


20 September 2021

Measuring the impact of digital learning: 12 rules for writing assessment tests

Writing effective assessment tests is one of the best ways to measure the impact of digital learning. They can make the difference between a successful digital learning solution and an unsuccessful one.

Assessment tests and quizzes can give you the reassurance that a learner has understood your elearning or digital learning content. They can also help learners remember and apply what they’ve learnt.

I’ve worked in elearning for over 20 years and I’ve seen all sorts of online assessment tests, from really good ones to ineffectual ones. Over the years, I’ve learned which pitfalls to avoid and, in doing so, picked up strategies and tips to make assessment a success – tips that really help in measuring the impact of your digital learning. So if you’ve ever asked yourself, ‘How do I measure the impact of digital learning?’, here’s just one of the ways, broken down into a dozen golden rules …

12 GOLDEN RULES FOR WRITING ASSESSMENT TESTS IN ELEARNING

1. Test understanding in digital learning, not just memory recall.

Firstly, this is so important it has to be my number one tip. Memorised facts fade really quickly, but if someone understands and can apply what they know, they’ll remember it for much longer. You can go beyond simple knowledge-recall questions by asking learners to interpret facts, evaluate situations, explain cause and effect, make inferences, and predict results.

2. Use simple and precise wording.

Measuring the impact of your digital learning starts before the assessment is even taken. When you write assessment tests, use simple sentences wherever possible, and be as precise as you can in your choice of words – in English, one word can have several meanings in different contexts.

3. Be consistent.

Keep to specific question formats and keep the number of answer options consistent. This helps users know what to expect, so they can concentrate their efforts on the subject matter rather than having to check the question format each time.

4. Make your instructions clear.

Use clear learner instructions, such as ‘Choose all the answers that apply’, or ‘Enter just the numbers in the box below without any additional text’.

5. Place most of the words in the question stem.

Where possible, make multiple-choice answers as short as possible by putting most of the information in the question stem.

6. Make all distractors plausible.

In multiple-choice questions, make all distractors (wrong answers) plausible. Consider using three or four distractors along with the correct answer. This provides sufficient scope to have two or three plausible distractors and one extremely plausible distractor, meaning that you’re really testing the user’s understanding of the subject.

7. Keep all the answer choices the same length.

Try to keep all the answer options the same length. This won’t always be possible, but in poorly written assessment questions, the longest answer is often the correct one!

8. Mix up the order of the correct answers.

Ensure that you mix up the order of the multiple-choice answers. The dynamicLMS Moodle Quiz and H5P functions have settings to automatically shuffle the order of multiple-choice answers – you then don’t even need to worry about this – let the system take the strain.
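If you’re building a quiz outside the LMS, the same idea is easy to sketch. Here’s a minimal Python sketch – using a hypothetical question structure of my own, not the Moodle Quiz or H5P format – that shuffles the answer options each time they’re shown:

```python
import random

# A hypothetical multiple-choice question: one correct answer plus distractors.
question = {
    "stem": "Which vitamin is produced in the skin on exposure to sunlight?",
    "correct": "Vitamin D",
    "distractors": ["Vitamin A", "Vitamin B12", "Vitamin C"],
}

def shuffled_options(q):
    """Return all answer options in a random order."""
    options = [q["correct"]] + q["distractors"]
    random.shuffle(options)  # in-place shuffle, so the order differs each run
    return options

print(shuffled_options(question))
```

Because the shuffle happens every time the question is presented, the correct answer has no fixed position for learners to spot.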

9. Don't use trick questions.

Do not, under any circumstances, use trick questions. If a question can be interpreted in two ways, or if the difference between answer options is subjective, then rewrite the question.

Also, do not use double negatives; these are likely to confuse the user, which is both unfair and undesirable – we want users to be able to demonstrate their understanding of a subject, not their mastery of untangling cryptic questions.

10. Use question types appropriately.

For example, a medicines calculation question would be far better (and more closely related to actual practice) if the user has to enter a numerical answer and select the units of measurement, rather than being given four possible options.

11. Use 'All of the above' and 'None of the above' with caution.

In my experience, in poorly written test questions, these are nearly always the correct answer – and lots of people know this too!

12. Don't forget about feedback.

Do you want to just give a total score at the end and a pass/fail? Or do you want to show the user where they went wrong? Do you want the user to learn from their incorrect answers? If you do, you’ll need to give specific feedback for every wrong answer and explain why it wasn’t the best answer to have selected.
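One simple way to make per-answer feedback unavoidable is to store it alongside every option. This is a rough Python sketch using a hypothetical data structure (again, not the Moodle or H5P format), where each option – right or wrong – carries its own explanation:

```python
# A hypothetical question structure pairing every answer option,
# correct or incorrect, with its own specific feedback.
question = {
    "stem": "Which vitamin is produced in the skin on exposure to sunlight?",
    "options": [
        {"text": "Vitamin D", "correct": True,
         "feedback": "Correct. Sunlight triggers vitamin D synthesis in the skin."},
        {"text": "Vitamin C", "correct": False,
         "feedback": "Not quite: vitamin C comes mainly from the diet."},
        {"text": "Vitamin B12", "correct": False,
         "feedback": "No. B12 is absorbed from food, not made in the skin."},
    ],
}

def feedback_for(q, chosen_text):
    """Return (feedback, was_correct) for whichever option the learner chose."""
    for option in q["options"]:
        if option["text"] == chosen_text:
            return option["feedback"], option["correct"]
    raise ValueError(f"Unknown option: {chosen_text!r}")

message, was_correct = feedback_for(question, "Vitamin C")
print(was_correct, message)  # a wrong answer still gets an explanation
```

Writing the quiz this way means an option without feedback simply can’t exist, so every wrong answer becomes a small teaching moment rather than just a lost mark.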

Interested in measuring the impact of digital learning? Get started with writing great assessment tests.

If you’re interested in measuring the impact of digital learning, but still not sure how to tackle your assessment, or it’s not quite hitting the mark, get in touch today. We have a team of experienced scriptwriters who can turn your ideas into a successful learning assessment.