NASA SUITS
Augmented Reality (AR) for Astronauts

Speed Read

In a rush? Here's the gist.
Challenge

In August 2023, I joined the CLAWS team at the University of Michigan to take part in the NASA SUITS Challenge. This design challenge engages college students nationwide in devising user interface solutions for forthcoming spaceflight requirements. Our objective was to develop and deploy an augmented reality (AR) system to assist astronauts in conducting extravehicular activities (EVAs), including tasks like spacecraft ingress and egress, geosample collection, and navigation between points.

Process

I participated in design workshops and ideation sessions aimed at brainstorming visual concepts for AR design. I acquainted myself with AR design frameworks and, as the project progressed, used user flows to chart the system's flow for the user. For the AR design, I leveraged the MRTK3 design system provided by Microsoft. The project was completed in three phases. We conducted design sprints and rigorous usability testing sessions, taking an iterative approach with in-person moderated testing and design iterations.

Takeaways

I learned the AR design process from start to finish, shaping the visual and narrative aspects from scratch. Using an existing design system showed me how to introduce new UI elements effectively. We found efficient ways to hand off UX designs to the AI and AR teams. Working with a team of 60 people from different departments felt like being in a startup, and watching how teams worked together was a valuable lesson throughout the project.

Impact

Our design proposal made it to the top 10 teams in the USA, earning us a spot in the final presentation at NASA, where our fully implemented AR system, IRIS, was showcased. The NASA panel was impressed by our presentation and commended CLAWS for its continuous improvement in solutions each year at the NASA SUITS Challenge.

Got more time? :)
Scroll down to read the entire case study
Theme
Immersive Technology | Augmented Reality (AR) Design | Wearable | MRTK3 Design System | Interaction Design
When
2023 - 2024
Team and Role
UX Designer at CLAWS (Collaborative Lab For Advancing Work in Space), University of Michigan

NASA's Mars Mission

The NASA SUITS (Spacesuit User Interface Technologies for Students) Design Challenge is for student teams to design and create space suit information displays within an augmented reality (AR) environment. These display and audio environments are intended to aid astronauts in performing spacewalk tasks.

During extravehicular activities (EVAs), astronauts wear next-generation spacesuits as they explore the Martian surface, conduct scientific research, and interact with various mission assets such as rovers, geology tools, landers, power systems, science payloads, and habitats. The challenge recognizes the potential of AR technology to provide astronauts with real-time visual displays of valuable data from these assets, thereby enhancing their task performance, reducing workload, and improving situational awareness.

Mission Requirements

  • Develop a UI, using the Microsoft HoloLens or another approved AR device, that enables astronauts to finish tasks more efficiently by providing a set of audible and visual instructions and tools via the display environment.
  • The UI shall assist the astronaut during navigation, provide EVA instructions, and display the telemetry stream data.
  • The primary stakeholders requiring design consideration are the Local Mission Control Center (LMCC) and the astronauts using the AR HMD (Head-Mounted Display).

Unique challenges on the Martian Surface

  • Communication latency between the mission control team on Earth and the crew on Mars due to the large distance between the two planets.
  • Mars missions are long in duration and tasks may span multiple days, which makes the mission control team an essential guide for the crew on Mars.
  • Special focus on equipment repair and maintenance in case of incapacitated crewmember rescue (ICR), loss of communication, or loss of transportation (rover rescue).
  • Dust particles in the atmosphere can easily stick to surfaces because they are slightly electrostatic. Oxidized dust particles in the air create a rusty tan hue due to scattering.

Meet HoloLens

Exploring the future of Augmented Reality (AR) with the Microsoft HoloLens.

Design Considerations

Before delving into designing for the HoloLens, it was crucial for me to grasp and establish design constraints and considerations across various elements of the device such as interaction styles, touch targets, sound usage, and voice interaction capabilities.

Interaction Styles

Scaling 

Rotating

Direct Tap

Touch Target Optimization
User Comfort
Voice Interaction

For voice interactions, our goal was to integrate voice commands for astronauts' use. The guidelines we set for these include:

  • Use concise commands
  • Make sure commands are non-obtrusive
  • Use visual affordances such as labels and quotes

We also sought to assess the reliability of voice interactions in the Martian environment, including potential interference from surface conditions such as wind and rover noise.
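To make these guidelines concrete, here is a minimal sketch of a voice-command registry that enforces concise phrases and pairs each command with a visual affordance label. The recognition hookup, the three-word limit, and all names here are illustrative assumptions, not the actual IRIS implementation.

```typescript
// Hypothetical voice-command registry; handleRecognition stands in for
// whatever keyword-recognition callback the AR runtime exposes.

interface VoiceCommand {
  phrase: string;      // concise command, e.g. "Open Samples"
  label: string;       // visual affordance rendered near the target, in quotes
  onRecognized: () => void;
}

class VoiceCommandRegistry {
  private commands = new Map<string, VoiceCommand>();

  register(cmd: VoiceCommand): void {
    // Keep commands concise: long phrases are harder to recognize over suit and rover noise.
    if (cmd.phrase.split(" ").length > 3) {
      console.warn(`Command "${cmd.phrase}" may be too long to recognize reliably`);
    }
    this.commands.set(cmd.phrase.toLowerCase(), cmd);
  }

  // Called by the (assumed) speech recognizer with the transcribed phrase.
  handleRecognition(phrase: string): void {
    this.commands.get(phrase.toLowerCase())?.onRecognized();
  }
}

// Usage: the button is labeled '"Open Samples"' so the user knows what to say.
const registry = new VoiceCommandRegistry();
registry.register({
  phrase: "Open Samples",
  label: '"Open Samples"',
  onRecognized: () => console.log("Opening geo-sample database"),
});
```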

Sound Usage

Given the constraints of astronauts wearing bulky spacesuits and the criticality of feedback in space environments, we established specific guidelines for integrating sounds in a subtle and unobtrusive manner across all scenarios.

  • Inform and Reinforce: Use sounds to convey information and strengthen user actions.
  • Notifications: Employ sounds for notifying users of important events or updates.
  • Multi-stages: Utilize sound cues to indicate progression through different stages or levels.
  • Exercise Restraint: Be selective with sound usage; avoid overloading the user with auditory feedback.
  • Emphasize one sound at a time: Focus on highlighting one sound per interaction to prevent confusion.

Screen Types

Body-locked

World-locked

Within immersive environments, screens are categorized into two primary types: body-locked and world-locked. For each screen we design, we must carefully decide which of the two it should be (a minimal sketch of the distinction follows the list).

  • World-Locked: World-locked screens remain stationary in space, neither adapting to nor tracking the user's movements.
  • Body-Locked: Body-locked screens are anchored to the user, moving with them and staying in view at all times.
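The sketch below expresses the two anchoring behaviors as a per-frame update loop. Vector3, Pose, the update hook, and the follow distance are made-up stand-ins for the AR engine's actual types, not MRTK3 APIs.

```typescript
// Illustrative anchoring logic: world-locked screens never move; body-locked
// screens re-anchor themselves in front of the user every frame.

type Vector3 = { x: number; y: number; z: number };
interface Pose { position: Vector3; forward: Vector3 }

class Screen {
  constructor(
    public mode: "world-locked" | "body-locked",
    public pose: Pose,
    private followDistance = 1.2 // assumed comfortable viewing distance, in meters
  ) {}

  // Called once per frame with the user's current head pose.
  update(head: Pose): void {
    if (this.mode === "world-locked") return; // stays fixed in space

    // Body-locked: keep the screen a fixed distance in front of the user.
    this.pose.position = {
      x: head.position.x + head.forward.x * this.followDistance,
      y: head.position.y + head.forward.y * this.followDistance,
      z: head.position.z + head.forward.z * this.followDistance,
    };
    // Turn the screen back toward the user so it stays readable.
    this.pose.forward = { x: -head.forward.x, y: -head.forward.y, z: -head.forward.z };
  }
}
```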

Project Timeline

Introducing IRIS

As a UX designer within the Collaborative Lab for Advancing Work in Space (CLAWS), I contributed to the development of the Immersive Reality Interplanetary System (IRIS). IRIS serves as an interactive augmented reality interface specifically designed to assist astronauts during extravehicular activities (EVAs) on Mars.

My responsibilities included designing three key features of IRIS:

  • The main menu for the entire IRIS system.
  • The interface for astronauts to collect and view geo-sample data.
  • The flow guiding the UIA egress tasks essential for preparing to exit the spacecraft for EVAs.

Exhibit A: Main Menu

  • The main menu presents, as navigation items, the various EVAs (extravehicular activities) expected of the user during their time on Mars.
  • A key design choice for IRIS was the incorporation of "IRIS modes," within which users can focus on a specific task, such as geosampling or navigating between zones, without interruptions from other notifications or task-related information (a sketch of this behavior follows the list).
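As a rough sketch of how such a mode could gate interruptions, the snippet below suppresses notifications that do not belong to the active mode. The mode names and the Notification shape are illustrative assumptions, not the shipped IRIS data model.

```typescript
// Hypothetical "IRIS modes" filter: while a mode is active, only notifications
// from that mode's task are surfaced to the astronaut.

type IrisMode = "geosampling" | "navigation" | "egress";

interface Notification { sourceMode: IrisMode; message: string }

class ModeManager {
  private active: IrisMode | null = null; // null = main menu, nothing suppressed

  enter(mode: IrisMode): void {
    this.active = mode; // astronaut focuses on a single task
  }

  exit(): void {
    this.active = null; // back to the main menu; all notifications show again
  }

  // Only surface notifications relevant to the active mode.
  shouldShow(n: Notification): boolean {
    return this.active === null || n.sourceMode === this.active;
  }
}
```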

Exhibit B: Geo-sample data collection for astronauts

An essential duty for astronauts is gathering samples from the Martian surface. Equipped with an XRF spectrometer, they scan these samples, which automatically populates certain information; additional details, however, require manual input by the astronaut. I was responsible for designing the input fields for these details and visualizing the process, ensuring ease of use and efficiency during sample-collection tasks.

User Flow
UI Design

The UI design of each AR element adhered to the MRTK3 design system. These screens were specifically crafted to facilitate manual input by astronauts for various fields not automatically populated by the XRF spectrometer.

The primary objective was to ensure seamless interactions with as few steps as possible. Key features (sketched in code after the list) include:

  • Information is saved automatically without needing a separate save button.
  • Astronauts can choose colors and shapes from preselected options to avoid typing.
  • They can speak their notes instead of typing them.
  • The zone name (labeled as C2) is filled in automatically based on the astronaut's location.
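A hedged sketch of how these behaviors might fit together: every field edit persists immediately, pickable fields are constrained to preset options, and the zone is pre-filled. Field names, the option lists, and the persist helper are illustrative assumptions, not the real data model.

```typescript
// Illustrative geo-sample record with save-on-edit semantics (no save button).

const COLORS = ["rusty tan", "grey", "dark brown"] as const; // preselected, no typing
const SHAPES = ["angular", "rounded", "layered"] as const;

interface GeoSample {
  zone: string;                          // auto-filled from the astronaut's location
  xrfReadings?: Record<string, number>;  // auto-populated by the XRF spectrometer
  color?: (typeof COLORS)[number];
  shape?: (typeof SHAPES)[number];
  voiceNote?: string;                    // dictated rather than typed
}

// Stand-in for writing to the saved geo-sample database.
function persist(sample: GeoSample): void {
  console.log("saved", sample);
}

// Every edit saves immediately, so there is no separate save step to forget.
function setField<K extends keyof GeoSample>(sample: GeoSample, key: K, value: GeoSample[K]): void {
  sample[key] = value;
  persist(sample);
}

// Hypothetical location lookup; "C2" mirrors the zone label mentioned above.
function currentZone(): string { return "C2"; }

const sample: GeoSample = { zone: currentZone() };
setField(sample, "color", "rusty tan"); // picked from the preset palette, auto-saved
```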

Recognizing the importance of feedback, we made sure to show the XRF readings in progress while the device scanned, filling in the information as readings arrived.

This represents the completed UI state, accessible through the saved geo-sample database.

  • The "View Samples" button allows the user to see all the saved geosamples.
  • The screen is divided into three columns for simplicity.
  • In the first column, the user can choose different zones. The second column displays the names of samples based on the selected zone. The third column provides details about the selected sample.
  • This setup enables the user to view all zones and their samples at once, making it quick and easy to find the desired information (the selection logic is sketched below).
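Here is a minimal sketch of the master/detail state behind the three columns, assuming a trimmed SampleSummary record and made-up selection helpers.

```typescript
// Column 1 picks a zone, column 2 lists that zone's samples, column 3 shows details.

interface SampleSummary { name: string; details: string }

interface SampleBrowser {
  zones: Map<string, SampleSummary[]>; // zone name -> saved samples in that zone
  selectedZone?: string;
  selectedSample?: SampleSummary;
}

// Selecting a zone populates column 2 and clears column 3.
function selectZone(b: SampleBrowser, zone: string): SampleSummary[] {
  b.selectedZone = zone;
  b.selectedSample = undefined;
  return b.zones.get(zone) ?? [];
}

// Selecting a sample shows its details in column 3.
function selectSample(b: SampleBrowser, sample: SampleSummary): string {
  b.selectedSample = sample;
  return sample.details;
}
```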

Exhibit C: Guide to exit spacecraft

The egress process comprises the following steps:

  • Conducting the airlock exit procedure to assess spacesuit life support systems.
  • Operating the tactical switchboard.
  • Transmitting real-time switch positions to Telemetry Stream Server (TSS).
  • Providing interaction instructions via the Head-Mounted Display (HMD).
User Flow
Egress Switch Board

Image Credits: NASA internal team

UI Design
  • To facilitate the egress procedure, I designed holographic displays featuring blinking arrows that indicate the direction in which each switch needs to be toggled (e.g., OPEN/CLOSE or ON/OFF). This dynamic arrow design effectively directs users' attention to the switches.
  • The blinking arrows persist for a few seconds, giving users sufficient time to orient themselves and observe the switches before the blinking stops. Once it stops, the physical toggle switch remains in clear, unobstructed view for the user to toggle.
  • Success feedback: Upon successful manipulation of a switch in the correct direction, the holographic display associated with that switch turns green. This visual cue signifies to the user that the action was performed correctly.
  • Error feedback: In case of user error, such as toggling the wrong switch or moving it in the wrong direction, the holographic display turns red. Additionally, the system provides corrective guidance, displaying the correct interaction to be performed.
  • Some steps during the egress process require the user to wait until certain readings, such as supply pressure and water levels, reach the right levels before the switches can be turned OFF/CLOSE.
  • A progress bar shows how far along the process is, helping the user know when it is safe to turn the switches OFF/CLOSE and thereby avoiding errors (the feedback logic is sketched below).
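The per-switch feedback described above amounts to a small state machine; a hedged sketch follows. Switch identifiers, the telemetry callbacks, and the display helpers are assumptions for illustration, not the actual TSS integration.

```typescript
// Illustrative egress-switch guidance: green on success, red plus corrective
// guidance on error, and a progress gate before telemetry-dependent toggles.

type SwitchState = "guiding" | "waiting" | "success" | "error";

interface EgressSwitch {
  id: string;
  expected: "OPEN" | "CLOSE" | "ON" | "OFF";
  state: SwitchState;
}

// Called when the telemetry stream reports a new physical switch position.
function onSwitchToggled(sw: EgressSwitch, actual: string): void {
  if (actual === sw.expected) {
    sw.state = "success"; // holographic display turns green
  } else {
    sw.state = "error";   // display turns red and shows the correct interaction
    showCorrectiveGuidance(sw);
  }
}

// Some steps gate the toggle on a reading (e.g. supply pressure) reaching a
// target; the progress bar tells the user when it is safe to flip the switch.
function updateProgress(sw: EgressSwitch, reading: number, target: number): void {
  const progress = Math.min(reading / target, 1);
  renderProgressBar(sw.id, progress);
  sw.state = progress < 1 ? "waiting" : "guiding";
}

// Stand-in display helpers.
function showCorrectiveGuidance(sw: EgressSwitch): void {
  console.log(`Toggle ${sw.id} to ${sw.expected}`);
}
function renderProgressBar(id: string, progress: number): void {
  console.log(`${id}: ${(progress * 100).toFixed(0)}% to safe toggle`);
}
```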

Usability Testing + Design Iterations

We organized structured, face-to-face usability tests for the IRIS system, which is designed around completing tasks in as few steps as possible. During these sessions, participants were assigned tasks, and their actions on the headset display were projected onto a separate screen for observation. While they performed the tasks, we asked about their comfort level, ease of use, and any points of confusion. Detailed notes were taken during the tests and later shared with all teams to refine the design.

Finding #1

The close button in the IRIS modes menu shares the same design as the other mode buttons, making it difficult for users to distinguish the mode options from the close button.

Design Change #1

To give the close button a clearly distinct UI, the initial idea was to remove the subtle backplate. However, in the design system the backplate signifies that an element is clickable, so I opted to retain the backplate but modify its appearance.

Finding #2

In certain areas of the system, consistency was disrupted, as some buttons included both text and icons while others included only text.

Design Change #2

To address the inconsistency, we standardized the design approach for all buttons, adding icons to those that lacked them. Icon-plus-text labels are generally easier and faster to comprehend.

Finding #3

We identified that the complexity of the IRIS system led to a significant number of users frequently asking questions, primarily because they lacked understanding of specific features or struggled to navigate through the system seamlessly.

Design Change #3

This prompted us to consider implementing an onboarding feature within the system. A crucial constraint, however, is that onboarding cannot occur after astronauts have landed on the Martian surface and are setting up their HMD (Head-Mounted Display), given the limited time available for extravehicular activities (EVAs). A pre-mission onboarding process therefore became essential; it was marked as a future step following the primary system design phase.

Key Learnings + Next Steps

  • When starting new projects, there are often many opportunities to create a product from scratch, giving you the chance to shape the narrative and visual design. It was a beneficial exercise to use an existing design system to build a product. This experience showed how a design system can be fully utilized and how it can help introduce new UI elements effectively.
  • The transition from UX to Dev remains ambiguous in the industry, particularly when it comes to 3D display designs. However, we worked together to devise an efficient approach for clearly conveying our design ideas to the teams handling AI and AR development on the backend.
  • This was my first time working with a large team of around 60 people, which included members from various departments such as Research, Marketing, Business, Hardware, Web, AI, AR, and UX. The collaborative atmosphere resembled that of a startup. Throughout the project, I had the opportunity to observe how different teams worked together, which was a valuable learning experience.
  • Augmented reality (AR) is becoming increasingly valuable in various fields such as worker training and surgery training. Its potential in outer space is particularly exciting, and I'm eager to see how NASA integrates it for their astronauts. I'm also looking forward to working on more AR projects myself.