An automated individual feedback and marking system: an empirical study

T. Barker

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

The recent National Student Survey showed that feedback to students remains an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed to mark practical and essay questions and to provide automated feedback. Recent research at the University of Hertfordshire showed that learners and tutors accept and value our automated feedback approach based on objective tests and Computer Adaptive Testing; the research reported in this paper is an important extension of that work. The automated feedback system developed for objective testing has been extended to include practical testing and essay-type questions. The system, which can be used in any subject area, is based on a simple marking scheme created by the subject tutor as a text file following a simple template. Marks for each option and a set of feedback statements are held in a database. As the teacher awards marks for each question, an individual feedback file is created automatically for each learner. Teachers may also add and modify comments for each learner and save the additional feedback to the database for later use. Each individual feedback file was emailed automatically to learners. The development of the system is explained in the paper, and its testing and evaluation is reported with 350 first-year (1 final practical test), 120 second-year (1 written and 1 practical test) and 100 final-year (1 final practical test) undergraduate Computer Science students. The time taken to mark practical and essay-type tests was reduced by more than 30% in all cases compared with previous years. More importantly, it was possible to provide good-quality individual feedback to learners rapidly: feedback was delivered to all students within three weeks of the test submission date. For end-of-module tests this was particularly beneficial, as it had proved difficult in the past to provide feedback after modules had ended. Examples of the feedback provided are presented in the paper, and the development of the system using a user-centred approach based on student and staff evaluation is explained. The comments of staff teaching on these modules, and of a sample of students who took part in this series of evaluations, are presented. The results of these evaluations were very positive and are reported in the paper, showing the changes made to the system at each iteration of the development cycle. The provision of fast, effective feedback is vital, and this system was found to be an important addition to the tools available.
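
To illustrate the kind of workflow the abstract describes (a tutor-authored marking-scheme text file, marks and feedback statements stored for each question, and an individual feedback file generated and emailed to each learner), the following is a minimal sketch in Python. It is not the authors' implementation: the template format (tab-separated question id, maximum mark, feedback statement), the function names, and the email details are all assumptions made for illustration only.

# Hypothetical sketch of the marking-scheme / feedback workflow described in the abstract.
import csv
import smtplib
from email.message import EmailMessage

def load_marking_scheme(path):
    """Read a tutor-authored text file: question_id <TAB> max_mark <TAB> feedback statement."""
    scheme = {}
    with open(path, newline="") as f:
        for qid, max_mark, feedback in csv.reader(f, delimiter="\t"):
            scheme[qid] = {"max": int(max_mark), "feedback": feedback}
    return scheme

def build_feedback(scheme, awarded, extra_comments=""):
    """Assemble an individual feedback text from the marks awarded per question."""
    lines, total, possible = [], 0, 0
    for qid, info in scheme.items():
        mark = awarded.get(qid, 0)
        total += mark
        possible += info["max"]
        lines.append(f"{qid}: {mark}/{info['max']} - {info['feedback']}")
    lines.append(f"Total: {total}/{possible}")
    if extra_comments:
        lines.append(f"Tutor comments: {extra_comments}")
    return "\n".join(lines)

def email_feedback(address, body, smtp_host="localhost"):
    """Send the generated feedback to the learner (addresses and host are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = "Individual feedback"
    msg["To"] = address
    msg["From"] = "tutor@example.ac.uk"
    msg.set_content(body)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

In use, a tutor would load the scheme once, enter the marks awarded for each learner, and the script would build and email the individual feedback file; the paper's system additionally stores marks, feedback statements, and tutor comments in a database for later reuse.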
Original language: English
Pages (from-to): 1-114
Journal: Electronic Journal of e-Learning
Volume: 9
Issue number: 1
Publication status: Published - 2011

Keywords

  • assessment
  • feedback
  • automated systems
  • development
  • evaluation
