The advent of cyber-physical systems, such as robots and autonomous vehicles (AVs), brings new opportunities and challenges for the domain of interaction design. Though there is consensus about the value of human-centred development, there is a lack of documented, tailored methods and tools for involving multiple stakeholders in design exploration processes. In this paper we present a novel approach using a tangible multi-display toolkit. By orchestrating computer-generated imagery across multiple displays, the toolkit enables multiple viewing angles and perspectives to be captured simultaneously (e.g. top view, first-person pedestrian view). Participants are able to directly interact with the simulated environment through tangible objects. At the same time, the objects physically simulate the interface's behaviour (e.g. through an integrated LED display). We evaluated the toolkit in design sessions with experts to collect feedback and input on the design of an AV-pedestrian interface. The paper reports on how the combination of tangible objects and multiple displays supports collaborative design explorations.
Project Partners
Key Publications
Marius Hoggenmueller, Martin Tomitsch, Callum Parker, Trung Thanh Nguyen, Dawei Zhou, Stewart Worrall and Eduardo Nebot. A Tangible Multi-Display Toolkit to Support the Collaborative Design Exploration of AV-Pedestrian Interfaces. In Proc. of the 32nd Australian Conference on Human-Computer Interaction (OzCHI'20).
Martin Tomitsch and Marius Hoggenmueller. Designing Human-Machine Interactions in the Automated City: Methodologies, Considerations, Principles. In Automating Cities: Robots, Drones and Urban Data in the Design, Construction, Operation and Future Impact of Smart Cities, Wang, B. T., Wang, C. M., Eds., Springer, Singapore.