Dr John Hedley, Senior Lecturer
School of Engineering
Faculty of Science, Agriculture and Engineering
What did you do?
Introduction to Instrumentation and Drive Systems (MEC3027 and its equivalent block version, MEC8058) is a module in which students undertake an assignment worth 60% and a computer-based exam worth 40%. The assignment consists of a design exercise and is submitted as a video presentation, which is then assessed.
The assignment marking time is significant, and the key issue to be addressed was ensuring fast, consistent marking with effective feedback for the large cohort of students. The assignment is set up so that specific marking criteria are defined, which enables descriptive student feedback to be generated automatically.
Who is involved?
Dr John Hedley, Senior Lecturer Engineering
How did you do it?
Assignment marking, particularly when there may be variability in possible solutions, such as in a design exercise, is very subjective. This is acceptable if a single marker assesses all the work, but when a group of markers is used, there is a risk of variability in both the marks and the feedback given. Standardising the assessment approach was therefore the first priority; feeding this back effectively to the students was the second.
There are five aspects to the approach:
- The problem (assignment) is very clearly defined, with the student informed of every aspect that will be assessed. As a basic example, let the assignment be ‘Design an autonomous car that uses a motor for drive, a servo motor for steering and three types of sensor, these being a light sensor, a bumper sensor and an ultrasonic sensor. You should demonstrate your car is functional by giving a working demonstration in the simulator and explain your design choices.’
- The marking criteria are then clearly defined for each marker. The marking is kept very simple and is grouped into relevant sections. Using the above example, the criteria would be something like: for the ultrasonic sensor, 3 marks for a well-explained, well-demonstrated implementation; 2 marks for a demonstration but no explanation; 1 mark for a design but no description or demonstration; 0 marks for no attempt. This is done for every sensor, and an overall mark is then returned for the sensor aspects of the assignment (and similarly for motors, programming, etc).
- Marks for each section are not returned, as this has raised a number of complaints from students in the past, generally of the form, “Why did my friend get 1 more mark on this aspect when our work was very similar?” Instead, the overall mark for each section is used as a guide to which standard text phrase is given as feedback. These standard phrases are programmed into an Excel spreadsheet so that feedback is generated automatically from the marks entered. For example, if the sensor section received marks of 3 + 1 + 1 = 5, the criteria are set so that a total between 3 and 6 gives the feedback “The sensor system aspect was generally done well but you need to include more demonstrations of it working in your submission”, whereas a total of 7 or over gives “The sensor system aspect was done to a very high standard, well done.” The comments from each section are then grouped to give an overall feedback statement. Additional comments may be added manually to the spreadsheet where appropriate, for example “Video quality was too poor, which hindered effective viewing of the demonstrations.”
- Students receive their mark and feedback (currently distributed by a mail merge, as Canvas does not have an automatic upload facility for text) and are given the opportunity to request additional feedback if they wish.
- For students requesting additional feedback, more detailed comments are given on the sections in which the student scored lowest. For example, if the sensor part was a low-scoring section, this individual feedback (which is written manually for each case) may say, “The weakest part of your assignment was the sensor section; the ultrasonic sensor part was done very well, but you failed to adequately demonstrate the other sensors.” The weakest aspects are easily identified from the marking sheet, which contains the full breakdown of marks for every aspect, so this second level of feedback can be produced manually reasonably quickly. If the student wants even more feedback, they are then invited to a face-to-face meeting for a full discussion of their submission.
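The banded mark-to-phrase mapping described above can be sketched in code. This is an illustrative sketch in Python (the actual implementation is an Excel spreadsheet); the thresholds and the two upper phrases mirror the example in the text, while the lowest band's wording is a hypothetical placeholder.

```python
# Bands are listed from highest threshold down; each maps a minimum total
# mark to the standard feedback phrase for that range.
SENSOR_BANDS = [
    (7, "The sensor system aspect was done to a very high standard, well done."),
    (3, "The sensor system aspect was generally done well but you need to "
        "include more demonstrations of it working in your submission."),
    (0, "The sensor system aspect needs significant improvement."),  # placeholder wording
]

def section_feedback(total_marks, bands):
    """Return the standard phrase for the highest band the total reaches."""
    for threshold, phrase in bands:
        if total_marks >= threshold:
            return phrase
    return ""

# Worked example from the text: ultrasonic 3 + light 1 + bumper 1 = 5,
# which falls in the 3-6 band.
print(section_feedback(3 + 1 + 1, SENSOR_BANDS))
```

In practice one such lookup would run per section (sensors, motors, programming, etc.), with the resulting phrases concatenated into the overall feedback statement.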
Why did you do it?
Initial attempts at setting the assignment with a single marker (to ensure marking consistency) required a short submission video. Students complained that they did not have enough time to demonstrate all the work they had done, so the submission was extended to a nominal 10-minute video (up to 12 minutes being acceptable). However, due to the large cohort (303 students across the two modules MEC3027 and MEC8058 for 2023–2024, equating to a total video watching time of around 60 hours), assessment must be divided across several markers. As marking of the required design work can be highly subjective in places, it was important to ensure both consistent marking and consistent feedback to the students, particularly considering how much effort the students put in to complete the work.
Does it work?
The approach works well because all students see in which areas they did well or poorly, and this information appears to be sufficient for most of them. Only a small number of students requested the additional feedback (14 of the 303 cohort), after which only 2 then requested a face-to-face meeting (both of these students actually did very well and simply wanted to know how to get closer to that ‘perfect’ mark).
Of those 14 requesting the additional feedback, the main complaint was comparison with a friend’s attempt where (using the above as a generic example) a student would score 6 and get feedback of ‘a good attempt’ whereas their friend would get 7 and get ‘excellent attempt’ as feedback, despite their attempts being similar. Introducing a smoother transition between ranges (either by using more ranges or possibly by introducing AI-generated feedback) would help to alleviate this situation; this is one aspect for future development.
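The ‘more ranges’ idea could be sketched as follows. This is a hypothetical illustration: with narrower bands, adjacent totals such as 6 and 7 receive phrases that differ by only one step of wording rather than jumping from ‘good’ to ‘excellent’. All thresholds and phrases here are invented for illustration.

```python
# Finer-grained bands: five ranges instead of the original two or three,
# so neighbouring totals land in adjacent, similarly-worded bands.
FINE_BANDS = [
    (8, "The sensor work was excellent throughout."),
    (7, "The sensor work was done to a very high standard."),
    (5, "The sensor work was good, with most aspects demonstrated."),
    (3, "The sensor work was a reasonable attempt but needs more demonstrations."),
    (0, "The sensor work needs significant improvement."),
]

def feedback(total):
    # Pick the phrase for the highest threshold the total meets.
    for threshold, phrase in FINE_BANDS:
        if total >= threshold:
            return phrase
    return ""

print(feedback(6))  # 'good' band
print(feedback(7))  # adjacent 'very high standard' band, one step up
```

The same spreadsheet mechanism would apply; only the number of rows in the lookup grows.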
The Graduate Framework
This project demonstrates the following attributes:
- Future focused
- Creative, Innovative and Enterprising
- Digitally capable
- Engaged