AI-Driven Usability Testing: Integrating Eye-Tracking Data and Agentic Systems for Automated UI Evaluation

Mehweesh Tahir Kadegaonkar*, Kayvan Karim

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

Despite its importance in user interface/experience (UI/UX) design, traditional usability testing remains resource-intensive and repetitive. This study proposes a novel system that integrates real-time browser-based eye-tracking with a multimodal agentic framework to automate UI evaluation. Participants interacted with task-specific interfaces while their gaze data was captured and analysed by a multi-agent system to generate structured usability reports grounded in heuristic principles. Precision metrics were used to quantify qualitative insights, enabling measurable evaluation. To enhance accessibility, a comparative analysis was conducted between proprietary and open-source Large Language Models (LLMs). Results showed that proprietary models consistently delivered accurate insights, whereas smaller local models struggled with reliability, highlighting future directions for offline deployment. The findings contribute to the advancement of AI-driven solutions in usability evaluation, showcasing how agentic systems integrated with browser-based eye-tracking tools can overcome traditional limitations.
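
The abstract does not state how the precision metric is computed; a minimal sketch, assuming the standard definition applied to agent-generated usability findings judged against human-validated ground truth, would be:

    Precision = TP / (TP + FP)

where TP counts agent-reported usability issues confirmed by evaluators and FP counts reported issues that were not confirmed.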
Original language: English
Publication status: Accepted/In press - 21 May 2025
Event: Association for the Advancement of Artificial Intelligence Symposium on Human-AI Collaboration 2025: Exploring diversity of human cognitive abilities and varied AI models for hybrid intelligent systems - Heriot-Watt Campus, Dubai, United Arab Emirates
Duration: 20 May 2025 - 22 May 2025

Conference

Conference: Association for the Advancement of Artificial Intelligence Symposium on Human-AI Collaboration 2025
Abbreviated title: AAAI SuS 2025
Country/Territory: United Arab Emirates
City: Dubai
Period: 20/05/25 - 22/05/25
