Fully autonomous vehicles have the potential not only to mitigate accidents caused by human error but also to fundamentally transform the way people commute in cities. Recent endeavours by government institutions and industry point towards shared autonomous vehicles (SAVs) as a likely future mobility scenario, rather than people owning personal vehicles. The deployment of SAV services could therefore improve the quality of urban life by reducing the need for parking spaces and easing congestion.
The ubiquitous roll-out of autonomous vehicles hinges on overcoming technological challenges. At the same time, human factors, including those affecting people outside the vehicle, have gained attention from industry and academia: for example, how can an autonomous vehicle communicate with surrounding pedestrians?
In this research project, we designed a low-resolution lighting-based external human-machine interface (eHMI) for a shared passenger transport pod. To evaluate the eHMI, we devised a ride-sharing scenario with multiple vehicles commuting in a shared urban environment, in which pedestrians, cyclists and maintenance vehicles use the same road.
Evaluating Interactions with Autonomous Vehicles in Virtual Reality
One aspect of this research was to investigate which simulation platforms and prototype representations to use for evaluating interactions between autonomous vehicles and pedestrians. We conducted a mixed-methods study comparing three prototype representations: real-world VR (RW-VR), computer-generated VR (CG-VR) and real-world video (RW-Video), thus covering two simulation platforms (VR and video). RW-VR is increasingly used in simulation studies, as 360-degree cameras are becoming more affordable and widely available and real-world footage offers a higher level of fidelity. CG-VR is a commonly used representation in pedestrian-AV safety research. RW-Video was included because video prototypes can be useful for evaluating context-based interfaces online. To create the RW-VR prototype representation, we worked in close collaboration with researchers from our university's engineering department, using a fully functional AV they had developed as a cooperative autonomous electric vehicle (CAV) platform.
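In a within-subjects comparison of three prototype representations, presentation order is typically counterbalanced so that order effects do not bias the results. The sketch below shows one common way to do this with a rotational Latin square, in which each representation appears once in every position; the condition labels match the study, but the specific counterbalancing procedure used in the study is not described here, so this is purely illustrative.

```python
# Illustrative counterbalancing sketch (not the study's actual procedure):
# build a rotational Latin square over the three prototype representations
# and assign each participant a row in round-robin fashion.

CONDITIONS = ["RW-VR", "CG-VR", "RW-Video"]

def latin_square(conditions):
    """Rotate the condition list so each condition appears
    exactly once in every ordinal position."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)]
            for row in range(n)]

def order_for(participant_id, square):
    """Assign participants to rows of the square in round-robin order."""
    return square[participant_id % len(square)]

if __name__ == "__main__":
    square = latin_square(CONDITIONS)
    for pid in range(6):
        print(pid, order_for(pid, square))
```

Note that a plain 3x3 Latin square balances position but not first-order carryover effects; a full counterbalance of all six orderings would be needed for that.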
Quantitative results showed that while the RW-VR representation produced a higher sense of presence, there were no significant differences in user experience or trust towards the autonomous vehicle. However, the interview data showed that participants focused on different experiential and perceptual aspects in each prototype representation. These differences relate to spatial awareness and to the perceived realism of the AV's behaviour and its context, which in turn affected how participants assessed trust and the eHMI. The paper offers guidelines for prototyping and evaluating context-based interfaces through simulations.
Project Partners
Key Publications
Marius Hoggenmueller, Martin Tomitsch, Luke Hespanhol, Tram Thi Minh Tran, Stewart Worrall, Eduardo Nebot. Context-Based Interface Prototyping: Understanding the Effect of Prototype Representation on User Feedback. In Proc. of the ACM CHI Conference on Human Factors in Computing Systems (CHI '21).
Martin Tomitsch, Marius Hoggenmueller. Designing Human-Machine Interactions in the Automated City: Methodologies, Considerations, Principles. In Automating Cities: Robots, Drones and Urban Data in the Design, Construction, Operation and Future Impact of Smart Cities, Wang, B. T., Wang, C. M., Eds., Springer Verlag, Singapore.