Motion Platform

PassengXR Open-Source Motion Platform

The Viajero project has produced PassengXR [1], an open-source hardware and software motion platform, to support our own research and that of others in creating passenger experiences and interfaces that make use of – or counteract – the real motion and location of the vehicle.

Conducting research into the design of in-vehicle eXtended Reality (XR) interfaces can be costly (e.g., driving simulators, buying/hiring real cars, expensive sensors) and practically challenging: maintaining stable and accurate tracking of the headset and vehicle, developing the functionality needed to relay and process sensor data, and gaining access to real driving data, among other difficulties.

Using off-the-shelf sensors, PassengXR provides practitioners with ready-made software for incorporating the motion of a vehicle and an on-board XR headset into a virtual scene, making it faster, easier and cheaper to create and test vehicular XR experiences.

Overview

Published at ACM UIST 2022 [1], PassengXR is built in Unity, and uses ESP32 Arduino-compatible IoT modules to capture, broadcast and receive vehicle telemetry wirelessly at low latency. It supports the sensing, recording and playback of all car movement (IMU orientation, OBD-II velocity, GNSS global position) for multiple co-located standalone XR headsets. PassengXR also supports a variety of approaches for maintaining headset alignment within the vehicle reference frame, enabling both motion-based and vehicle-based XR content.

The left side of the image shows a top-down view of the four Arduino-compatible hardware components: an ESP32 Thing Plus, an IMU, an OBD-II connector and a GPS/GNSS antenna. The right side shows a screenshot of the Unity motion platform software.
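
As a rough illustration of the vehicle-based alignment idea, the Unity (C#) sketch below counter-rotates an XR rig's parent transform by the vehicle's IMU yaw, so that cabin-fixed content stays aligned when the car turns rather than appearing to swing away. This is a minimal example under an assumed scene setup and with hypothetical field names; it is not the PassengXR API, and the smoothing value is arbitrary.

using UnityEngine;

// Illustrative sketch only (not the PassengXR API): one simple way to keep
// virtual content aligned with the vehicle cabin. A standalone headset's
// inertial tracking also senses the car's turns, so cabin-fixed content
// appears to drift when the vehicle rotates; counter-rotating the XR rig's
// parent by the vehicle's IMU yaw cancels that drift.
public class CabinAlignment : MonoBehaviour
{
    // Parent transform of the XR camera rig (assumed scene setup).
    public Transform xrRigParent;

    // Latest vehicle heading in degrees, e.g. parsed from telemetry
    // broadcast by the ESP32 module (hypothetical field).
    public float vehicleYawDegrees;

    // Smoothing factor to damp IMU noise (assumed value).
    [Range(0f, 1f)] public float smoothing = 0.1f;

    void LateUpdate()
    {
        // Rotate the rig opposite to the vehicle's heading so the cabin
        // reference frame stays fixed in the virtual scene.
        Quaternion target = Quaternion.Euler(0f, -vehicleYawDegrees, 0f);
        xrRigParent.rotation = Quaternion.Slerp(
            xrRigParent.rotation, target, smoothing);
    }
}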

All sensor data can be recorded for analysis of participant testing and, using the same Unity scene configuration in a lab-based setting, PassengXR can play the recorded data back within Unity in real time, recreating the same movements, locations and events as the real car journey. This means that designers who cannot access a car can still design an interface and test how real car movements manifest in the virtual scene for local VR users. To support this, we will provide three datasets recorded during real drives in three UK environments: a country road, a city road and a motorway.
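
The sketch below illustrates the general idea of real-time playback: recorded samples are replayed against elapsed time so the virtual vehicle follows the same motion profile as the logged drive. The sample format used here (timestamp, IMU yaw, OBD-II speed) is a stand-in chosen for the example, not PassengXR's actual log format.

using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only: replays pre-recorded telemetry samples in real
// time by comparing each sample's timestamp against the time elapsed since
// playback began. The record layout is a stand-in, not PassengXR's format.
public class TelemetryPlayback : MonoBehaviour
{
    public struct Sample
    {
        public float time;        // seconds since the start of the recording
        public float yawDegrees;  // vehicle heading from the IMU
        public float speedKmh;    // vehicle speed from OBD-II
    }

    public List<Sample> recording = new List<Sample>();

    float playbackStart;
    float currentSpeedKmh;
    int index;

    void OnEnable()
    {
        playbackStart = Time.time;
        currentSpeedKmh = 0f;
        index = 0;
    }

    void Update()
    {
        float elapsed = Time.time - playbackStart;

        // Advance to the latest sample whose timestamp has been reached.
        while (index < recording.Count && recording[index].time <= elapsed)
        {
            Sample s = recording[index++];
            transform.rotation = Quaternion.Euler(0f, s.yawDegrees, 0f);
            currentSpeedKmh = s.speedKmh;
        }

        // Integrate the most recent speed to move the virtual vehicle,
        // recreating the motion profile of the real drive.
        transform.position += transform.forward
            * (currentSpeedKmh / 3.6f) * Time.deltaTime;
    }
}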

Access to Motion Platform Software

We are in the process of making the motion platform software available through GitHub, but in the meantime, please contact Graham Wilson for information.

The software consists of two components: 1) an Arduino project (C++) that gathers data from the vehicle IMU, OBD-II and GNSS sensors and sends it to 2) a Unity project (C#) running the motion platform. Documentation will be provided for both components, and the Unity project will include demonstration scenes to get you started.
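
To give a sense of how the Unity side might ingest wireless telemetry from the Arduino component, the sketch below drains UDP packets each frame and parses a simple comma-separated payload. The port number and the "yaw,speed,lat,lon" payload are assumptions made for this example only; they are not the PassengXR wire protocol, which the documentation will describe.

using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Illustrative sketch only: receives telemetry packets in Unity over UDP.
// The port and the comma-separated payload layout are assumptions for this
// example, not the actual PassengXR protocol.
public class TelemetryReceiver : MonoBehaviour
{
    public int port = 8888;      // assumed port
    public float yawDegrees;     // latest IMU heading
    public float speedKmh;       // latest OBD-II speed
    public double latitude;      // latest GNSS fix
    public double longitude;

    UdpClient client;

    void OnEnable()
    {
        client = new UdpClient(port);
    }

    void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (client.Available > 0)
        {
            IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] data = client.Receive(ref remote);
            string[] fields = Encoding.ASCII.GetString(data).Split(',');
            if (fields.Length < 4) continue;

            yawDegrees = float.Parse(fields[0]);
            speedKmh   = float.Parse(fields[1]);
            latitude   = double.Parse(fields[2]);
            longitude  = double.Parse(fields[3]);
        }
    }

    void OnDisable()
    {
        if (client != null) client.Close();
    }
}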

[1] M. McGill, G. Wilson, D. Medeiros, and S. Brewster, “PassengXR: A Low Cost Platform for Any-Car, Multi-User, Motion-Based Passenger XR Experiences,” in UIST ’22: Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 2022.
BibTeX:
@inproceedings{passengxr2022,
  title     = {PassengXR: A Low Cost Platform for Any-Car, Multi-User, Motion-Based Passenger XR Experiences},
  author    = {McGill, Mark and Wilson, Graham and Medeiros, Daniel and Brewster, Stephen},
  booktitle = {UIST '22: Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology},
  year      = {2022}
}