PhD

Title: ENHANCING THE USER EXPERIENCE OF THE EVALUATION OF INTERACTIVE, NARRATIVE BASED SYSTEMS

Abstract: Questionnaires are a habitual choice for many user experience evaluators, providing a well-recognised and accepted, fast and cost-effective method of collecting and analysing data. However, despite frequent and widespread use in evaluation, reliance on questionnaires can be problematic. Satisficing, acquiescence bias, straight-lining and suboptimal responding are common response biases associated with questionnaires, typically resulting in poor-quality data. These problems can stem from a lack of engagement with evaluation tasks, yet previous research has not focussed on alleviating these limitations by making questionnaires more fun or enjoyable in order to enhance participant engagement.

This research seeks to address the question: ‘Can evaluation be designed to be as engaging as the interactive application being evaluated?’ The aim is to investigate whether the quality of the data provided can be improved through enhanced questionnaire design that maintains participant engagement, making questionnaire completion more enjoyable and thereby reducing common response biases. The evaluation context for this study was provided by MIXER, an interactive, narrative-based application targeting intercultural conflict resolution for children aged 9-11 that was to be both used and evaluated in the classroom context.

A series of mixed-methods studies investigated evaluation techniques and approaches with consideration of participant engagement. These initial studies informed the design of a set of evaluation materials, created in the form of three comic-style workbooks, that were used in evaluation studies of MIXER. Results demonstrate that making questionnaire completion more enjoyable improves data quality: response biases are reduced, quantitative data are more complete, and qualitative responses are more verbose and meaningful compared with standard evaluation techniques. Further, children reported that completing the questionnaires was a fun and enjoyable activity that they would wish to repeat in the future.

As a discipline in its own right, evaluation is under-investigated. This research provides a significant contribution to the field, demonstrating that the outputs of questionnaire-based evaluation are improved when participant engagement informs questionnaire design. The result is a more positive evaluation experience for participants and, in turn, a higher standard of data provision for evaluators and R&D teams.