Computerized Assessment


Complete an exam or quiz for your LMS course site, reflecting a variety of question types and assessment strategies, and built within Moodle’s quiz tool. Your quiz must have 10 questions. You will be assessed based on creating:

●  3  multiple choice questions
●  3 matching questions
●  2 short answer questions
●  2 short essay questions
●  One question with an embedded image or graphic
●  Partially or wholly auto-assessed/graded
●  Time limited
●  Pre-programmed post-exam feedback for students
●  A reflection upon your experience completing this assignment


Coming from a background in training, assessment through online quizzes or exams is an area where I lack practice, so I enjoyed the opportunity to explore the assessment tools in Moodle. For this activity I chose to create a review and practice exam situated in week three of a post-secondary intensive (6 credit) industrial/product design studio. Among the course description requirements is that students ‘integrate all aspects of design practice’, which includes the application of ergonomic principles. Applied ergonomics is one of many prerequisite courses students would have taken prior to this class, and they will be expected to apply it for a significant portion of their final mark in the current course. Since this involves complicated, situation-specific application of skill, students generally need to review and practice with theoretical situations until they are able to reliably integrate these skills on their own.

“Based on the objectives you have written, develop assessments that are parallel to and measure the learners’ ability to perform what you described in the objectives.” (Dick & Carey, 1990)

My practice exam is meant to provide a diagnostic formative assessment (Jenkins, 2004) as well as an opportunity for skills practice to support the fulfillment of the learning objectives.


Practice exam


According to Gibbs & Simpson, “the most powerful single influence [with regards to student achievement] is feedback” (2005), which should be “timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance” (2005). Since the pace of the course is rigorous and hinges on effective time management, I set the question behaviour to allow for “immediate feedback.” All but two of the 10 questions are auto-marked, which helps reduce the marking load on teachers.
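The combination of auto-marking and immediate feedback can be sketched outside Moodle. The following Python snippet is purely illustrative: the question data, answers, and feedback strings are hypothetical examples of my own, not content from the actual exam or from Moodle’s internals.

```python
# Illustrative only: mimics the "immediate feedback" question behaviour,
# where each auto-marked question is scored as soon as it is answered
# and remedial feedback is returned right away.
# The questions, answers, and feedback below are hypothetical examples.

QUESTIONS = {
    "q1": {"answer": "percentile",
           "feedback": "Review the anthropometric data readings before retrying."},
    "q2": {"answer": "reach envelope",
           "feedback": "Revisit the workstation layout exercises."},
}

def mark_immediately(qid: str, response: str) -> tuple[bool, str]:
    """Score one auto-marked question and return (correct, feedback).

    Rather than revealing the correct answer, an incorrect response
    gets a pointer to review material.
    """
    question = QUESTIONS[qid]
    correct = response.strip().lower() == question["answer"]
    return correct, "" if correct else question["feedback"]
```

The key design point mirrored here is that wrong answers trigger guidance towards review material rather than disclosure of the answer, so repeated attempts remain meaningful.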


Question Behaviour


The short assessment on my demo site: click here (log in as: Visitor, Password123!)


On the introduction page I explained to students that the activity is intended to help them ‘brush up’ on their ergonomics skills, and that they must complete the practice exam once before the next class. This helps evaluate the initial level of these skills. I also explained that students would be permitted to take a variant of the exam again (up to three times in total) and that the best of the three scores would constitute their mark. This way students can use the initial attempts to self-assess and then work on improving their skills.
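The best-of-three rule described above can be sketched as a small function. This is only an illustration of the grading logic (Moodle applies it through its own quiz settings for attempts allowed and grading method), not actual Moodle code:

```python
# Illustration of the grading rule described above: students may attempt
# the exam up to three times, and the highest score counts as their mark.

MAX_ATTEMPTS = 3

def final_grade(attempt_scores: list[float]) -> float:
    """Return the highest score among at most MAX_ATTEMPTS attempts."""
    counted = attempt_scores[:MAX_ATTEMPTS]  # attempts beyond three are ignored
    return max(counted) if counted else 0.0
```

Because only the best attempt counts, early attempts carry no grade risk, which is what makes them usable for self-assessment.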


Wherever possible I chose to guide “the choice of further instructional or learning activities to increase mastery” (Gibbs & Simpson, 2005); I made feedback appear immediately after each answer and, rather than giving students the correct answers, provided links to review the relevant materials along with information on the importance of particular topics in relation to their final deliverables:


Match question

Overall I used the various aspects of the exam to: assess prior knowledge and skills; guide students towards materials and exercises that increase these skills; have them logically identify and sort phases of design, research, and ergonomic measurements for correct application; and practice applying these in theoretical design scenarios. I saved the essay questions for last, as “competent students should be able to provide coherent explanations; generate plans for problem solution; implement solution strategies; and monitor and adjust their activities” (Anderson, 2008).




With Conditions Under Which Assessment Supports Students’ Learning (Gibbs & Simpson, 2005) in mind, the following are the conditions I attended to in the design of my assessment:


Condition 1 Allows students to spend sufficient time on a topic to learn it effectively (time span of study and review over a few weeks). Time and effort are captured, and frequency is used to increase motivation (repetition of variant exams/exercises).
Condition 2 Students must allocate appropriate amounts of time to help distribute effort (multiple tries to incrementally increase score if needed).
Condition 3 Geared towards appropriate types of learning; engages students in problems that use the discourse at hand. Makes it clear that application of theory and discourse is assessed, making process part of the learning goal (direct application of relevant skills and objectives, as in the screenshot above).
Condition 4 Remedial feedback and speed of response may compensate for lack of individualization (partly automated and auto-marked with feedback).
Condition 5 Feedback provides options for action (specific review options/links and advice provided on incorrectly answered questions).
Condition 6 Emphasis on immediate feedback at each stage (direct after-question feedback).
Condition 7 Can be used to correct errors, develop understanding through explanations, generate more learning through further specific study tasks, promote metacognition through reflection, and allow for self-assessment.
Condition 8 Repetition of the assignment requires that students attend to feedback if they score poorly (see the instructions page, first page, and last page of the quiz).
Condition 9
Condition 10 Feedback is acted upon by the student (in subsequent attempts). Uses known skills and allows monitoring for improvement.

Overall, the process of creating the exam/quiz in Moodle was time-consuming and unintuitive, but I now see the power of integrated assessment tools when they are carefully and thoughtfully applied. I really look forward to practicing these skills!

Anderson, T. (2008a). Towards a theory of online learning. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning. Edmonton AB: Athabasca University.

Dick, W., & Carey, L. (1990). The systematic design of instruction. New York: Harper Collins. Chapter 1: Introduction to instructional design (pp. 2-11).

Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1), 3-31. Retrieved here.

Jenkins, M. (2004). Unfulfilled promise: formative assessment using computer-aided assessment. Learning and Teaching in Higher Education, 67-80. Retrieved here.

Image source: Plings. (2010). Plings_005. [Image file]. Retrieved from Flickr under CC by 2.0 license.
