For the 2023-2024 and current 2024-2025 projects, I am now part of the frontend software development team.



NASA Spacesuit User Interface Technologies for Students (SUITS)
NASA SUITS is an annual design competition to create a fully functional augmented reality Heads-Up Display (HUD) for astronauts on future Artemis missions. Our design, grounded in research and feedback from astronauts and NASA employees, was chosen as one of ten finalists to be presented, tested, and evaluated by NASA staff at the Johnson Space Center Rock Yard in Houston.
Our team split into four subteams to meet the required tasks for the product; I was the design lead for the ROVER operation segment of our system.
Project Managers: Ashley Fan, Michael Wang, Jess Young
Design Leads: Ryan Lee, Bill Xi, Bryce Yao, Dan Luo, Dong Yoon Shin, Keya Shah, Linlin Yu, Pei-Jung Hsieh
Development Leads: Jamie Chen, Danielle Kim, George Xu, Martin Ma, Julius Beberman
Faculty Advisor: Michael Lye
Challenge
Utilize novel AR technology to streamline and enhance the tasks performed during Artemis lunar extravehicular activities (EVAs) and to reimagine future astronauts' spacewalk experiences.

Main Design Objectives

Visual Accessibility
Interface must account for extreme lighting conditions on the moon by enhancing and not obstructing field of vision.

Tactile Accessibility
Interaction design must account for the astronaut's limited hand mobility when wearing a spacesuit.

System Clarity
System must be harmonious with the astronaut's behavior to provide a safe and efficient experience under high-risk conditions on the lunar surface.
Design Requirements

Egress
Egress marks the beginning of the astronauts’ lunar journey. The interface should ensure proper sequencing and performance of procedures while minimizing risk for human error.

Navigation
This component should efficiently guide astronauts across the lunar site, support mission tasks at marked points of interest, and bring them back to the lander safely while closely monitoring terrain anomalies.

ROVER Commanding
The interface should allow an astronaut to intuitively and precisely direct an autonomous ROVER from its original location to a point of interest, monitor its status, and recall it to the user's current coordinates.

Geological Sampling
An astronaut should be able to collect lunar rock sample information such as sample #, lithology, and timestamps using an RFID hand tool. The data collected should be saved and accessible in a separate section within the interface.
Final Design Features
Easy Access Palm Menu
Due to limited tactile mobility, we made the main menu accessible with a single motion: simply flip your hand to open the Palm Menu (shown on right).
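
As a rough illustration of this interaction, the sketch below (not our production code) checks whether the tracked palm faces the headset and toggles a menu object; MRTK2 ships a HandConstraintPalmUp solver that provides similar behavior out of the box. The component, field names, and angle threshold here are hypothetical.

```csharp
using UnityEngine;

// Minimal sketch (assumptions noted above): show the palm menu while the
// user's palm is turned toward the headset camera.
public class PalmMenuToggle : MonoBehaviour
{
    [SerializeField] private Transform palmTransform;      // tracked palm joint; its "up" is assumed to be the palm normal
    [SerializeField] private GameObject menuRoot;          // the palm menu UI
    [SerializeField] private float activationAngle = 40f;  // degrees of tolerance (assumed value)

    private void Update()
    {
        if (palmTransform == null || Camera.main == null) return;

        // Angle between the palm's outward normal and the direction to the headset.
        Vector3 toHead = (Camera.main.transform.position - palmTransform.position).normalized;
        float angle = Vector3.Angle(palmTransform.up, toHead);

        // Show the menu only while the palm is flipped toward the user.
        menuRoot.SetActive(angle < activationAngle);
    }
}
```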


Palm Menu Functions

Accessing Palm Menu
Egress Task Guidance
- Procedures on the task list turn green with a check mark once the associated analog switch on the Umbilical Interface Assembly (UIA) inside the airlock has been flipped (a telemetry-polling sketch follows this list).
- Before a user can progress to the next task, a confirmation of "proceed to procedure X of 9" appears.
- The task list's EVA information can be accessed at any time from the Palm Menu.
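
A minimal sketch of how such a switch check could work, assuming a hypothetical telemetry endpoint, JSON field names, and task labels; the real SUITS telemetry server's API is not reproduced here.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch only: poll a telemetry endpoint for UIA switch states and
// mark the matching egress procedure complete.
public class UiaSwitchWatcher : MonoBehaviour
{
    [SerializeField] private string telemetryUrl = "http://localhost:8080/uia"; // assumed endpoint

    [System.Serializable]
    private class UiaState { public bool o2Vent; public bool emuPower; } // assumed fields

    private IEnumerator Start()
    {
        while (true)
        {
            using (UnityWebRequest request = UnityWebRequest.Get(telemetryUrl))
            {
                yield return request.SendWebRequest();
                if (request.result == UnityWebRequest.Result.Success)
                {
                    var state = JsonUtility.FromJson<UiaState>(request.downloadHandler.text);
                    // Turn the associated task green once its analog switch is flipped.
                    if (state.o2Vent) MarkComplete("O2 VENT");
                    if (state.emuPower) MarkComplete("EMU POWER");
                }
            }
            yield return new WaitForSeconds(1f); // poll roughly once per second
        }
    }

    private void MarkComplete(string taskName)
    {
        // In the real interface this turns the checklist row green with a check mark.
        Debug.Log($"Procedure complete: {taskName}");
    }
}
```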


UIA Panel Layout

UIA Panel Interaction

Testing by Design Evaluator
ROVER Commanding
- Click the ROVER command button to drop a ROVER destination pin. This directs the ROVER to begin moving.
- The Palm Menu reveals a ROVER return button that directs the ROVER back to your current GPS coordinates.
- ROVER status and operation are displayed on the integrated Navigation map.
- A Self-Center button re-centers the map around the user's coordinates (a coordinate-conversion sketch follows this list).
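
The sketch below illustrates one way the map could convert GPS latitude/longitude into positions on the flat navigation map and re-center on the user; the origin values, scale factor, and component layout are assumptions for illustration, not the team's actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: project lat/lon onto the 2D map and implement Self-Center.
public class NavigationMap : MonoBehaviour
{
    [SerializeField] private RectTransform mapContent;    // scrollable map layer
    [SerializeField] private RectTransform userMarker;    // marker for the astronaut (child of mapContent)
    [SerializeField] private double originLat = 29.5650;  // map origin (placeholder values)
    [SerializeField] private double originLon = -95.0810;
    [SerializeField] private float mapUnitsPerMeter = 0.1f;

    private const double MetersPerDegreeLat = 111_320.0; // rough equirectangular scale

    // Project lat/lon to local map coordinates around the chosen origin.
    public Vector2 ToMapPosition(double lat, double lon)
    {
        double metersNorth = (lat - originLat) * MetersPerDegreeLat;
        double metersEast  = (lon - originLon) * MetersPerDegreeLat
                             * Mathf.Cos((float)(originLat * Mathf.Deg2Rad));
        return new Vector2((float)metersEast, (float)metersNorth) * mapUnitsPerMeter;
    }

    // Self-Center button: shift the map so the user's marker sits in the middle.
    public void CenterOnUser(double userLat, double userLon)
    {
        Vector2 userPos = ToMapPosition(userLat, userLon);
        userMarker.anchoredPosition = userPos;
        mapContent.anchoredPosition = -userPos;
    }
}
```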

ROVER (Remotely Operated Vehicle in Extended Reality) provided by NASA

Dropping ROVER Waypoint & Map Self-Centering
Navigation Map and Compass
- The compass stays out of the user's direct field of vision unless needed: when it isn't being used, it hovers slightly above the line of sight, and a slight head tilt is all that is required to bring it into view (see the placement sketch after this list).
- Placing a waypoint or hazard marks that specific location on the map.
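
A minimal sketch of this placement idea, with assumed distance and elevation values: the compass follows the headset but sits a few degrees above the gaze line, so a slight upward head tilt brings it into direct view.

```csharp
using UnityEngine;

// Illustrative sketch: keep the compass pinned a few degrees above the gaze line.
public class CompassPlacement : MonoBehaviour
{
    [SerializeField] private float distance = 1.5f;       // meters in front of the user (assumed)
    [SerializeField] private float elevationAngle = 15f;  // degrees above the gaze line (assumed)

    private void LateUpdate()
    {
        Transform head = Camera.main != null ? Camera.main.transform : null;
        if (head == null) return;

        // Raise the head's forward direction by the elevation angle (rotate about the head's right axis).
        Vector3 raisedForward = Quaternion.AngleAxis(-elevationAngle, head.right) * head.forward;
        transform.position = head.position + raisedForward * distance;

        // Keep the compass face oriented toward the user so its heading ticks stay readable.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
    }
}
```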

User placing custom Waypoint on Map

Main Navigation Interface with Compass

Map Buttons
Geological Sampling Tool
- Enter or exit a geological sampling session through the Palm Menu.
- Sample rock data is collected using an RFID scanner connected to the telemetry stream.
- To prevent accidentally ending a session, the user must confirm the session ending before returning to the navigation home screen (sketched after this list).
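
The confirm-before-exit behavior can be sketched as a simple two-step flow; the component and event names below are hypothetical stand-ins for illustration.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch: the first "End Session" press only opens a confirmation
// prompt; the session actually closes after the user confirms.
public class SamplingSession : MonoBehaviour
{
    [SerializeField] private GameObject confirmationPrompt; // "End session?" dialog
    public UnityEvent onSessionEnded;                        // e.g. switch back to the navigation home screen

    public void RequestEndSession()
    {
        // Guard against accidental presses: ask before leaving the session.
        confirmationPrompt.SetActive(true);
    }

    public void ConfirmEndSession()
    {
        confirmationPrompt.SetActive(false);
        onSessionEnded.Invoke(); // return to the navigation home screen
    }

    public void CancelEndSession()
    {
        confirmationPrompt.SetActive(false); // stay in the sampling session
    }
}
```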

Using RFID Scanner on a sample rock
Project Timeline

Research Key Insights
Our design team utilized resources from the previous 2020-2022 design team to inform our problem discovery and definition of goals for the new design. These materials included:

Technology Research and User Interviews

Takeaways from the previous Product Results
The 2020-2022 design team interviewed specialists in pertinent fields, including astronauts, geological scientists, cartographers, XR specialists, and UI/UX designers. These insights served as research starting points for our 2022-2023 challenge and helped determine what further research the team should conduct.

James H. Newman
Former NASA Astronaut

Steve Swanson
Retired NASA Astronaut

Jim Head
Geological Sciences

James Russell
Earth, Environmental, and Planetary Sciences

Peter H. Schultz
Geological Sciences

Jonathan Levy
Cartographer

Isabel Torron
UX Designer

Alejandro Romero
VR/UX Specialist
Insight 1: Visual Accessibility
Limited visibility on the moon significantly impairs an astronaut's understanding of distance and perspective.
- What's closer to the user is more important, which means it should be higher in the visual hierarchy (Romero).
- "Be as minimal as you can, it might not be the best for all controls (on the interface) to disappear, but it can be helpful for organization" (Levy).

Insight 2: Tactile Accessibility
Astronaut suits provide limited tactile mobility, so the hand movements we implement are deliberately limited.
- "With bulky gloves, there's no tactile feedback" (Swanson).
- "The main challenge (of this project) are the gloves because they are airtight and large, so mobility is tough" (Torron).
- "Use bigger hand movements (palm motions, pointing, etc.)" (Romero).

Insight 3: Clarity
Clarity of task procedures and communication is necessary during extravehicular activities (EVAs).
- A procedure list and notices for suit-status extremes are necessary (Torron).
- The checklist relies on memorization; it would be helpful to improve it, as it currently looks like a "poorly designed book" (Newman).

Design Process
The team managers divided the design team into four sub-teams, each corresponding to one of the four main design requirements: Egress, Navigation, ROVER Commanding, and Geological Sampling. Each sub-team was responsible for the research, iteration, and design of its requirement while cohesively working together to share, combine, and revise ideas to create an integrated and logical final product.
As design lead for the ROVER Commanding sub-team, I oversaw the conception and development of the feature throughout the design life cycle. I closely communicated between the Navigation design team and the Development team to reach a solution that was robust, functional, and efficiently incorporated within the larger mapping system.
Iteration Timeline
The design iteration process ran from October 2022, when our preliminary design proposal was submitted to NASA, to April 2023, when we finalized a product to send to Houston for Test Week in May 2023.

Iteration 1
Design Proposal

- Employed a blue-sky approach based on NASA's Mission Description and the functions of existing technology.
- ROVER Commanding ideas considered utilizing a peripheral joystick, ray-cast & slider drag gestures, pin pinch & drag gestures, surface magnetism object manipulation, voice commands, and location presets.
- Discussed features and tested HoloLens capabilities; reviewed the shortcomings of the 2020-2022 design.
Iteration 2
Figma Prototyping

- Created higher fidelity prototypes to communicate and iterate concepts within Figma
- Began development for the HoloLens on the MRTK2 (Microsoft Mixed Reality Toolkit 2) Unity API using Figma Bridge.
- I consolidated the ROVER Commanding ideas into two input methods: voice input of either directions or preset locations, and a ray cast from the palm onto the physical environment surface.
Iteration 3
Design Revisions

- The team implemented a unified design system based on the MRTK2 asset package as well as custom-designed icons.
- Based on input from developers, I evolved the ROVER command action to use a virtual directional pad that lets the astronaut fine-tune the destination pin (sketched below). Alternate methods were abandoned to account for inaccuracies in the HoloLens's gesture tracking and the astronaut's limited fine motor control. All interactions became point & click, which we believe is the simplest gesture for the user to actuate accurately.
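
A minimal sketch of the directional-pad idea, with an assumed step size: each point-and-click press nudges the destination pin by a fixed increment, so precise placement never depends on fine gesture tracking.

```csharp
using UnityEngine;

// Illustrative sketch: four buttons nudge the ROVER destination pin by a fixed step.
public class DestinationPad : MonoBehaviour
{
    [SerializeField] private Transform destinationPin;  // the pin shown on the map
    [SerializeField] private float stepMeters = 0.5f;   // distance moved per click (assumed value)

    // Wired to the four D-pad buttons (point & click only, no drag gestures).
    public void NudgeNorth() { Nudge(Vector3.forward); }
    public void NudgeSouth() { Nudge(Vector3.back); }
    public void NudgeEast()  { Nudge(Vector3.right); }
    public void NudgeWest()  { Nudge(Vector3.left); }

    private void Nudge(Vector3 direction)
    {
        destinationPin.position += direction * stepMeters;
    }
}
```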

Iteration 4
User Testing
- We designed a 30-minute user interview procedure for participants to evaluate our current iteration using a functional Figma laptop prototype.
- After reviewing our design with NASA Mentor Skye Ray, I initiated the decision to merge the Navigation and ROVER processes into the same system state / map menu for a more intuitive, streamlined, and logical user flow.
Iteration 5
User Testing 2

- We created a custom design system to further improve readability and contrast.
- Developers worked on Unity processing and latency improvements.
Iteration 6
Field Testing (2 Rounds)

- We traveled to Beavertail State Park in Jamestown, RI, at night to test the visibility of the color scheme, using light strobes to simulate a lunar environment with harsh directional lighting.
- Developers worked on issues with the GPS system and the Telemetry Stream Server.
- We discovered key pain points of the existing prototypes: glitching, icon clarity, etc.
- We ensured that the minimum viable product (MVP) could be delivered for test week.
Preliminary User Flows
Before and during our initial implementation of ideas in Figma, we created user flows with two main ideas in mind:
- Incorporating research insights about plausible features and existing technology that we could take inspiration from.
- Prioritizing which existing or additional features to incorporate based on last year's testing results and NASA feature requirements.

Egress

Navigation

ROVER Commanding

Geological Sampling
Through team ideation, we decided on core initial design principles to follow throughout the product:

Universal Functions
Functions shared across all screens (ex. compass, main menu) should be cohesive.

Hand Triggers
To address the lack of tactile mobility, the same actions should be triggered by the same hand movements.

Order of Operations
Our design should follow the given sequential order (egress first, etc.) rather than being freeform.
Initial User Testing
During Iterations 4 and 5, I completed initial experience testing to inform future design revisions.
- I organized and led interviews for 2 of the 5 total adult participants, aged 18-50.
- Tested on a Figma desktop prototype designed to follow along with a scripted procedure.
RISD professors were chosen due to their age similarity to the target astronauts and their logistical availability. From the previous year's interviews, it was determined that astronauts' perceptions of AR interfaces are comparable to those of ordinary people: neither group has significant prior exposure to such interfaces. While testing a subject with knowledge of EVA tasks would be ideal, our tests still allowed us to evaluate the performance of the UI, even if participants were unable to provide insight into further in-spacewalk considerations.

Cheeny Celebrado-Royer
Assistant Professor, RISD

Leah Beeferman
Assistant Professor, RISD

Testing Insights

Contrast
Overall design needs higher color contrast and larger font sizes.

Icon Clarity
Some icons were difficult to identify. Icons should look more cohesive.
Feature Consolidation
The ROVER and Navigation feature merge consistently yielded successful user task completions!
HoloLens Field Testing
At the final stages of Iteration 6, we initiated two rounds of field testing with our design running on the HoloLens. We sent a team of designers and developers to Beavertail State Park in Jamestown, RI, at night to test the visibility of the color scheme, using light strobes to simulate a lunar environment with harsh directional lighting.
For the first round of field testing, I was one of the designers responsible for documenting insights on the program's performance and ensuring the essential requirements of the product would be met.

Simulating the moon's environment with high-powered torch

Trying on the display in night conditions for the first time

Testing interaction ergonomics & performance
Revised User Flow (ROVER Step)

Design System
Throughout the design process, our team collaboratively evolved the MRTK3 preset assets into a custom design system that achieved our goals of visibility in a high-contrast environment, unification of our icon language, and clarity of the displayed information.

Within the team design system, I was responsible for implementing our design goals for the ROVER operational icons, merging the existing language of the navigation pin with a custom visual to communicate the link to the ROVER.

Deployment at NASA SUITS Test Week
As finalists for the NASA SUITS Design challenge, our team had the chance to present, test, and receive evaluation and feedback from NASA staff at the Johnson Space Center Rock Yard during the SUITS test week from May 18-23, 2023.
As part of the program, NASA only allowed five team members to enter the test site, so unfortunately I was unable to participate in the test week activities.
Below is a review of the testing team's work on the design during the week's proceedings.

Test Week Footage
The activities were broken up into three sections: briefing, testing, and debriefing.
Briefing

Onboarding NASA design evaluator Kelly Mann, equipped with a custom-built high beam apparatus for improved visibility
Testing

NASA mentor Skye Ray tests our design (In the image above, he's testing Egress using the mock UIA)
Debriefing

Ray provides feedback using his experience as a NASA UX Designer
Test Week Revisions
Despite our rigorous tuning in Providence, the team made additional small design changes during testing in response to advice from evaluators and unforeseen new directives from NASA.

ROVER Revisions
Upon arrival in Houston, NASA specified two new requirements for the ROVER operation system that had not been disclosed prior to the testing week:
- Display the explicit geographic coordinates of the ROVER destination pin within the interface.
- Code preset ROVER destination points for the astronaut to choose from, taken from a list of 6 Rock Yard coordinates provided by NASA (a sketch of both additions follows this list).
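
A minimal sketch of both on-site additions, using placeholder types and values (the six NASA-provided coordinates are not reproduced here): a fixed list of preset destinations plus a label showing the active pin's explicit coordinates.

```csharp
using UnityEngine;

// Illustrative sketch: preset ROVER destinations and an explicit coordinate readout.
public class RoverPresets : MonoBehaviour
{
    [System.Serializable]
    public struct PresetPoint
    {
        public string name;
        public double latitude;
        public double longitude;
    }

    [SerializeField] private PresetPoint[] presets;       // the six provided Rock Yard points (filled in the inspector)
    [SerializeField] private TextMesh coordinateReadout;  // unobtrusive label on the navigation map

    // Called when the astronaut selects one of the low-opacity preset markers.
    public void SelectPreset(int index)
    {
        PresetPoint p = presets[index];
        // Display the explicit geographic coordinates of the destination pin.
        coordinateReadout.text = $"{p.name}  {p.latitude:F6}, {p.longitude:F6}";
    }
}
```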
Ironically, I had considered the solution of implementing preset destination points from the very first iteration!

An early ROVER commanding procedure that I created in January 2023, which implemented a preset point selection method
The Team's Solution

ROVER destination coordinates placed at an unintrusive location on the navigation map

ROVER destination presets added to the map in a low-opacity inactive state
I found this instance to be an exceptional real-world example of how design is a cyclical, non-linear process: abandoned branches from previous iterations can be re-evaluated at a later stage and found advantageous to the solution, even after other ideas have been explored and pursued. In this case, the team returned to an input system idea that we had moved on from months earlier.

Our ROVER Commanding solution demonstrates a cyclical, non-linear design journey
Takeaways
Working as a subsystem team lead all the way to the completion of the project, I acquired countless new problem-solving strategies and adaptive approaches to team workflows. My top lessons can be summarized into three takeaways:
Takeaway 1
Team projects require a wider awareness of each member's unique perspective in order to determine the best strategy for exchanging ideas.
- Many project setbacks occurred due to misunderstandings within the design team or between the designers and the developers.
- Navigating our project, I learned the challenge and importance of communicating in ways that can be understood by the receiving end.
Takeaway 2
While employing a highly iterative mindset ensures consideration of a wide selection of ideas, it is essential to then employ empirical evidence to evaluate their quality.
- A piece of advice from our NASA mentor Skye Ray that stuck with me is that decisions should be made through objective testing, not just subjective speculation.
- When working with a team of designers who all had different opinions on where our design should go, I found that citing proven facts was the most effective way to support my argument and coordinate a team consensus.
Takeaway 3
Unlike in personal projects, when working on a long-term group project it's important to balance your individual passion with the benefit your work brings to the team.
- When working on intersecting features within the product, or documenting my features for others to implement when I was unable to travel to Houston for test week, I had to let go and pass off my effort to my team partners.
- It's counterintuitive, but in the circumstances where someone else held a better idea, superior expertise, or greater familiarity with the problem, the most valuable thing I could do for the team was to step back and let them take ownership of the task.
- Allowing team members' personal strengths to shine resulted in both a more functional group workflow and a more robust product solution.
Next Steps
After my experiences in the 2022-2023 season, I have continued contributing to the RISD SUITS team as a software developer for the 2023-2024 season and the current 2024-2025 season.
I chose to move from being a design subsystem lead to a developer to gain a more holistic understanding of the entire product development process, enhancing the contributions that I can make as a designer, both for this team and future organizations that I'll be a part of.
