This paper is published in Volume-7, Issue-1, 2021
Area
Computer Science
Author
Prabhath Mannapperuma, Munasinghe T. S., Wijetunga D., Perera M. M. I., Jagath Wickramarathne
Org/Univ
Sri Lanka Institute of Information Technology, Colombo, Sri Lanka
Pub. Date
10 February, 2021
Paper ID
V7I1-1226
Keywords
Automated User Experience Testing, User Experience, Remote User Testing, Usability Issues

Citations

IEEE
Prabhath Mannapperuma, Munasinghe T. S., Wijetunga D., Perera M. M. I., Jagath Wickramarathne. Automated User Experience Monitor (A.U.X.M), International Journal of Advance Research, Ideas and Innovations in Technology, www.IJARIIT.com.

APA
Prabhath Mannapperuma, Munasinghe T. S., Wijetunga D., Perera M. M. I., Jagath Wickramarathne (2021). Automated User Experience Monitor (A.U.X.M). International Journal of Advance Research, Ideas and Innovations in Technology, 7(1) www.IJARIIT.com.

MLA
Prabhath Mannapperuma, Munasinghe T. S., Wijetunga D., Perera M. M. I., Jagath Wickramarathne. "Automated User Experience Monitor (A.U.X.M)." International Journal of Advance Research, Ideas and Innovations in Technology 7.1 (2021). www.IJARIIT.com.

Abstract

User experience is a crucial element of any software product. If users cannot use the software easily and satisfactorily, they will not be inclined to use it, so testing the user experience of software is important. Conducting usability evaluations can be tedious, however, since several steps are involved in setting up even a single usability test, most notably scheduling meetings between the test user and the expert evaluator who will conduct it. Although several commercial products allow designers to “relive”, or observe, how users use their software, no existing product allows usability testing to be conducted with the role of the expert evaluator fulfilled automatically. This paper describes a novel approach, the Automated User Experience Monitor (AUXM) system, in which usability testing can be conducted in the absence of an expert evaluator, thus eliminating the need to arrange meetings between test users and expert evaluators. The system allows designers and developers to have their user interfaces tested for usability: the designer of the prototype screens feeds their designs into the system and then invites any number of users to test the usability of those designs. While the users perform the tasks the designer has provided, their emotional feedback, keystrokes, mouse clicks, and verbal utterances are recorded. The system uses the recorded data to inform the designers of what went wrong and to suggest how to fix the issues in the designs. Results show that such a system is indeed feasible, and that usability analysis can be performed far more easily with it than by having expert evaluators and test users meet in person. In conclusion, if the proposed approach is developed further and refined, usability testing in the software production industry could change significantly.
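As a rough illustration of the kind of session logging the abstract describes, the sketch below records timestamped interaction events (keystrokes, mouse clicks, utterances) for a test user and produces a simple per-type summary. All names here (`InteractionEvent`, `TestSession`, `record`, `summary`) are hypothetical and not taken from the AUXM paper; this is only a minimal model of the data-collection step, not the authors' implementation.

```python
# Hypothetical sketch of AUXM-style interaction logging: collect
# timestamped events during a usability test session, then summarize
# them for later analysis. Not the paper's actual implementation.
from dataclasses import dataclass, field
import time


@dataclass
class InteractionEvent:
    kind: str        # e.g. "keystroke", "mouse_click", "utterance"
    detail: str      # key name, click target, or transcribed phrase
    timestamp: float  # seconds since the epoch


@dataclass
class TestSession:
    user_id: str
    events: list = field(default_factory=list)

    def record(self, kind: str, detail: str) -> None:
        """Append one timestamped interaction event to the session log."""
        self.events.append(InteractionEvent(kind, detail, time.time()))

    def summary(self) -> dict:
        """Count events per kind, a trivial stand-in for real analysis."""
        counts: dict = {}
        for e in self.events:
            counts[e.kind] = counts.get(e.kind, 0) + 1
        return counts


session = TestSession("user-01")
session.record("mouse_click", "submit_button")
session.record("keystroke", "Enter")
session.record("mouse_click", "submit_button")
print(session.summary())  # {'mouse_click': 2, 'keystroke': 1}
```

In a real system of this kind, the event stream would also carry emotion estimates and audio transcripts, and the analysis step would map recurring patterns (e.g. repeated clicks on the same target) to likely usability issues.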