In August 2023, I joined the CLAWS team at the University of Michigan to take part in the NASA SUITS Challenge, a design challenge that engages college students nationwide in devising user interface solutions for future spaceflight needs. Our objective was to design and deploy an augmented reality (AR) system to assist astronauts during extravehicular activities (EVAs), including tasks such as spacecraft egress and ingress, geosample collection, and point-to-point navigation.
I participated in design workshops and ideation sessions to brainstorm visual concepts for the AR interface. I familiarized myself with AR design frameworks and, as the project progressed, used user flows to map how astronauts would move through the system. For the AR design, I leveraged Microsoft's MRTK3 design system. The project was completed in three phases, with design sprints and rigorous usability testing throughout. Our approach was iterative, combining in-person moderated testing sessions with successive design refinements.
I learned the AR design process end to end, shaping the visual and narrative aspects from scratch. Working within an existing design system taught me how to introduce new UI elements effectively, and I found efficient ways to hand off UX designs to the AI and AR development teams. Collaborating with a team of 60 people from different departments felt like working in a startup, and watching how those teams coordinated was one of the most valuable lessons of the project.
Our design proposal made it to the top 10 teams in the USA, earning us a spot in the final presentation at NASA, where our fully implemented AR system, IRIS, was showcased. The NASA panel was impressed by our presentation and commended CLAWS for improving its solutions year after year at the NASA SUITS Challenge.
The NASA SUITS (Spacesuit User Interface Technologies for Students) Design Challenge is for student teams to design and create space suit information displays within an augmented reality (AR) environment. These display and audio environments are intended to aid astronauts in performing spacewalk tasks.
During extravehicular activities (EVAs), astronauts wear next-generation spacesuits as they explore the Martian surface, conduct scientific research, and interact with assets such as rovers, geology tools, landers, power systems, science payloads, and habitats. The challenge recognizes the potential of AR technology to provide astronauts with real-time visual displays of valuable data from these assets, thereby enhancing their task performance, reducing workload, and improving situational awareness.
Exploring the future of Augmented Reality (AR) with the Microsoft HoloLens.
Before delving into designing for the HoloLens, it was crucial for me to grasp and establish design constraints and considerations across various elements of the device such as interaction styles, touch targets, sound usage, and voice interaction capabilities.
Regarding voice interactions, our goal was to integrate voice commands for astronauts' use. Guidelines set for these include:
We also sought to assess the reliability of voice interactions in the Martian environment, including examining potential obstructions caused by Martian surface elements such as wind and rover noises.
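One way to reason about reliability under Martian surface noise is to require a confidence threshold before a voice command is accepted. The sketch below is illustrative only: the command phrases, the fuzzy-matching approach, and the threshold value are assumptions for discussion, not the actual IRIS voice grammar.

```python
import difflib

# Hypothetical command set; the real IRIS commands are not shown here.
COMMANDS = ["open navigation", "start geo sample", "begin egress", "close menu"]

def match_command(transcript: str, threshold: float = 0.7):
    """Return the best-matching command for a recognized transcript,
    or None when the match is too weak to trust (e.g. speech masked
    by wind or rover noise producing a garbled transcript)."""
    best = difflib.get_close_matches(transcript.lower(), COMMANDS,
                                     n=1, cutoff=threshold)
    return best[0] if best else None
```

Rejecting low-confidence matches rather than guessing keeps a noisy environment from triggering unintended actions, at the cost of occasionally asking the astronaut to repeat a command.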
Given the constraints of astronauts wearing bulky spacesuits and the criticality of feedback in space environments, we established specific guidelines for integrating sounds in a subtle and unobtrusive manner across all scenarios.
Within immersive environments, screens are categorized into two primary types: body-locked and world-locked. Each screen we design necessitates careful consideration to determine whether it should be body-locked or world-locked.
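The distinction between the two screen types can be sketched in a few lines of positioning logic. This is a minimal illustration, not IRIS code: the function name, smoothing factor, and default viewing distance are all assumptions.

```python
def panel_position(head_pos, head_forward, anchor, mode,
                   distance=1.2, smoothing=0.1):
    """Compute an AR panel's world-space position (x, y, z lists).

    world_locked: the panel stays pinned to a fixed point in the room.
    body_locked: the panel follows the user at a fixed offset, eased
    with simple exponential smoothing so it trails the head slightly.
    """
    if mode == "world_locked":
        return list(anchor)
    # Target a point `distance` meters ahead of the user's gaze.
    target = [h + distance * f for h, f in zip(head_pos, head_forward)]
    # Move a fraction of the way there each frame for a gentle follow.
    return [a + smoothing * (t - a) for a, t in zip(anchor, target)]
```

The practical consequence for each screen is whether it should travel with the astronaut (status displays, menus) or remain fixed in the environment (markers on a sample site or airlock panel).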
As a UX designer within the Collaborative Lab for Advancing Work in Space (CLAWS), I contributed to the development of the Immersive Reality Interplanetary System (IRIS). IRIS serves as an interactive augmented reality interface specifically designed to assist astronauts during extravehicular activities (EVAs) on Mars.
My responsibilities included designing three key features of IRIS:
Exhibit A : Main Menu
Exhibit B : Geo-sample data collection for astronauts
An essential duty for astronauts involves gathering samples from the Martian surface. Equipped with an XRF spectrometer, they scan these samples, which automatically populates certain fields. However, additional details require manual input by the astronaut. I was responsible for designing the input fields for these details and visualizing the process for astronauts, ensuring ease of use and efficiency during sample collection tasks.
The UI design of each AR element adhered to the MRTK3 design system. These screens were specifically crafted to facilitate manual input by astronauts for various fields not automatically populated by the XRF spectrometer.
The primary objective was to ensure seamless interactions with minimal steps required. Key features include:
Recognizing the importance of feedback, we made sure to display the XRF readings in progress during the scan, with fields filling in as data arrived from the device.
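The split between scanner-populated and manually entered fields can be sketched as a small merge step that also tells the UI which fields still need the astronaut's attention. The field names below are assumptions for illustration, not the actual IRIS data schema.

```python
# Hypothetical field split; real IRIS fields may differ.
AUTO_FIELDS = {"composition", "rock_type"}            # filled by the XRF scan
MANUAL_FIELDS = {"color", "texture", "location_note"}  # astronaut input

def merge_sample(scan_results: dict, manual_input: dict) -> dict:
    """Combine XRF scanner output with astronaut-entered fields,
    flagging anything still empty so the UI can prompt for it."""
    record = {k: scan_results.get(k) for k in AUTO_FIELDS}
    record.update({k: manual_input.get(k) for k in MANUAL_FIELDS})
    record["missing"] = sorted(k for k, v in record.items() if v is None)
    return record
```

Tracking the `missing` list is what lets the interface highlight incomplete fields rather than forcing the astronaut to review every entry after a scan.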
This represents the completed UI state, accessible through the saved geo-sample database.
Exhibit C : Guide to exit spacecraft
The egress process comprises the following steps:
We organized structured, face-to-face usability tests for the IRIS system, which is built around completing tasks in as few steps as possible. During these sessions, participants were assigned tasks, and their actions on the headset display were mirrored onto a separate screen for observation. While they performed the tasks, we also asked about their comfort level, ease of use, and any points of confusion they encountered. Detailed notes were taken during the tests and later shared with all teams to refine the design.
The close button in the IRIS modes menu shares the same design as other mode buttons, making it difficult for users to distinguish between the mode options and the close button.
To enhance the close button's UI and create a clear distinction, the initial idea was to remove the subtle backplate. However, considering the design system, the backplate signifies the element's clickability. Hence, I opted to retain the backplate but modify its appearance.
In certain areas of the system, consistency was disrupted, as some buttons included both text and icons while others included only text.
To address the inconsistency, we standardized the design of all buttons, adding icons wherever they were absent. Icon-plus-text labels are generally easier and faster to comprehend than text alone.
We identified that the complexity of the IRIS system led to a significant number of users frequently asking questions, primarily because they lacked understanding of specific features or struggled to navigate through the system seamlessly.
This prompted us to consider implementing an onboarding feature within the system. However, a crucial challenge to bear in mind is that the onboarding cannot occur after astronauts have landed on the Martian surface and are setting up their HMD (Helmet-Mounted Display) due to the limited time available for Extra-Vehicular Activities (EVAs). Therefore, a pre-mission onboarding process became essential. This was marked as a future step following the primary system design phase.