Mixed Reality Exploration Toolkit (MRET)

A showcase of the ability to select various overlays to place over the earth, displaying scientific data sets.

The most recent public release of the Mixed Reality Exploration Toolkit can be downloaded here!

I worked for NASA at the Goddard Space Flight Center as a VR/AR Toolkit Developer.

As part of my work, I created enhancements to the NASA Mixed Reality Exploration Toolkit (MRET), a VR/AR package built for the Unity game engine.

One MRET project that needed enhancements was the toolkit’s Capture, Containment and Retrieval System (MRET-CCRS) project. MRET-CCRS is a digital simulation of the real Capture, Containment and Retrieval System (CCRS), a payload in the joint Mars Sample Return mission between NASA and the European Space Agency. CCRS is made up of a Capture and Containment Module and an Earth Return Module, which together capture orbiting samples and bring them back to Earth. The goal of the mission is to achieve the first-ever successful return of samples from Mars.

The main improvement needed in the MRET-CCRS project was to simulate the user’s hands in the in-game environment, which meant the user’s controllers had to interact physically with other in-game objects. I called this feature “Hard Collision.” It added an extra layer of connection between the virtual and real worlds: users could no longer simply move their hands through objects in the virtual environment, so they had to be more careful about where they moved their hands and more mindful of the objects their hands would collide with.

Other challenges with this feature were ensuring the user could see both where the hand hit an in-game object and where the hand currently was in real life. It also had to be possible to turn the feature on and off.

  • To make the controllers interact physically with objects, a copy of each controller object was created. This new controller object always followed the position of the user’s real hands. These controller objects were then given colliders and a rigidbody, so they would collide with, and be unable to pass through, other objects present within the scene.
  • To allow the user to be able to see the real position of their hands in relation to the position of their in-game hands, I created a shader that would make an object visible through other objects. This shader was attached to the controller that tracks the real hand position. Whenever the player collided with an object in-game, the in-game representation of their hands was frozen and this shader was toggled, allowing the player to see the real position of their hands through the object or wall while also seeing where they had collided with an object in the game.
  • Hard Collision was made toggleable at runtime. Toggling it switched a separate set of controllers on or off and removed the special shader that makes the controllers at the real hand positions visible through objects. The target positions for the in-game model’s hands also changed depending on the mode: with Hard Collision off, the model’s hands followed the real-world controllers directly, while with Hard Collision on, they followed the new physics-enabled controller object, which constantly tried to track the real-world controllers while keeping collision active.
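The toggle described above can be sketched as a small state machine. This is a hedged, illustrative Python sketch, not MRET’s actual implementation (which lives in Unity C# components); all class and member names here (`Controller`, `HardCollisionRig`, `see_through`, etc.) are hypothetical:

```python
# Illustrative sketch of the Hard Collision mode switch.
# Names and structure are assumptions, not MRET's real API.

class Controller:
    def __init__(self, has_collision):
        self.has_collision = has_collision  # colliders + rigidbody attached
        self.see_through = False            # "visible through walls" shader on/off


class HardCollisionRig:
    def __init__(self):
        # Tracked controller: always follows the real hand, no collision.
        self.tracked = Controller(has_collision=False)
        # Physics controller: tries to follow the real hand but can be blocked.
        self.physics = Controller(has_collision=True)
        self.hard_collision = False

    def toggle(self):
        """Switch Hard Collision on or off at runtime."""
        self.hard_collision = not self.hard_collision
        # The see-through shader is only needed when the physics copy
        # can be stopped by geometry while the real hand keeps moving.
        self.tracked.see_through = self.hard_collision

    def hand_target(self):
        """Return the controller the in-game hand model should follow."""
        return self.physics if self.hard_collision else self.tracked
```

With Hard Collision off, `hand_target()` returns the tracked controller, so the hands pass through geometry; toggling it on switches the target to the colliding copy and enables the see-through shader on the tracked one.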

Another primary contribution was to the VALIXR project, which used MRET to provide a VR and AR visualization of the NASA Goddard Earth Observing System (GEOS) climate and weather model. VALIXR is an open-source XR front end that leverages MRET to provide an immersive, intuitive environment for visualizing and interacting with the GEOS model’s output. It adds a “Holographic Table” visualization to MRET for layering 2D and 3D data; the 3D data is rendered using point clouds, volumetric rendering, isosurfaces, and procedural meshes.

For this project, I led discussions with weather scientists and interpreted what the scientists needed from the software, then designed and implemented features to allow them to view and form connections between critical data sets.

  • I implemented the primary data sources panel, which let users toggle various layers on and off, each visualizing a different data set.
  • I created the primary controls for the “Holographic Earth” pedestal, while also prototyping a variety of other in-game panels and panel controls using Figma.
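The data sources panel behaves like a set of independent layer toggles. The following is a minimal Python sketch of that behavior under stated assumptions; the layer names are placeholders, not the actual GEOS data sets, and the real panel is built on MRET’s Unity UI rather than this class:

```python
# Minimal sketch of a layer-toggle panel like the VALIXR data sources panel.
# Layer names below are placeholders, not real GEOS data sets.

class DataSourcesPanel:
    def __init__(self, layers):
        # Each layer starts hidden; toggling shows or hides its visualization.
        self.visible = {name: False for name in layers}

    def toggle(self, name):
        """Flip one layer's visibility and return the new state."""
        self.visible[name] = not self.visible[name]
        return self.visible[name]

    def active_layers(self):
        """List the layers currently shown on the Holographic Earth."""
        return [name for name, on in self.visible.items() if on]


panel = DataSourcesPanel(["sea_surface_temperature", "cloud_cover"])
panel.toggle("cloud_cover")
print(panel.active_layers())  # -> ['cloud_cover']
```

Keeping each layer’s visibility independent is what lets users overlay and compare multiple data sets on the same globe at once.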
