The year was 1994. Bill Clinton was the U.S. president; the first conference devoted to the commercial potential of the World Wide Web opened in San Francisco; the NFL announced the Jacksonville Jaguars as the league's 30th franchise; Ace of Base had the top song on the music charts … and a small band of scientists and creatives tucked in a dimly lit Supercomputing Lab in Idaho Falls, Idaho, was given this objective: “Investigate the feasibility and usefulness of simulation-based planning and design concepts for restoring and cleaning up subsurface hazardous waste sites. In particular, the feasibility of using available characterization data from waste sites to generate a virtual environment representing the site and environment to assess remediation and planning.” In plainer terms: develop a virtual reality prototype to aid the effective, accurate and safe cleanup of buried hazardous waste.
Until then, the team had been focused on parallel processing, computer animation and scientific visualization. Somehow, though, it made sense that VR, likely one of the most compelling applications of that work, would be the natural next addition to our repertoire.
Through the visionary leadership of Principal Investigators Tom Larson and Eric Greenwade, the Virtual Environment Project Team, as we came to be called, rose to the challenge.
While the results, lessons and breakthroughs of this work are too considerable to share in a single blog post, I will offer a few fun facts from “the early days,” focusing on what the technology was like at the time, to give readers a greater appreciation for how far it has come.
The Early Days
Cray Y-MP supercomputer
First and foremost, we didn’t have ‘goggles’. We didn’t have plug-and-play or pre-defined 3D objects; there was no Internet Explorer, no recognized ‘cloud’ or AWS, no Nvidia GPU, and there certainly was no Unity.
When we started, we didn’t even have a database. Characterization data had to be hand-typed into data sets, we were using RGB codes and sequences to create colors, highly complex math algorithms had to be developed from scratch, and I didn’t know anyone who didn’t have a copy of Graphics Gems by Andrew S. Glassner on their shelf.
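For readers curious what “RGB codes and sequences” looked like in practice, here is a minimal sketch, my own illustration rather than the team’s original code, of hand-building a color ramp from normalized RGB triples, the kind of thing visualization code of that era did before standard colormap libraries existed:

```python
# Hypothetical sketch: building a color ramp by hand from RGB triples,
# with each channel normalized to the range [0.0, 1.0].

def lerp(a, b, t):
    """Linear interpolation between scalars a and b at parameter t."""
    return a + (b - a) * t

def color_ramp(start_rgb, end_rgb, steps):
    """Return `steps` RGB triples blending start_rgb into end_rgb."""
    ramp = []
    for i in range(steps):
        t = i / (steps - 1) if steps > 1 else 0.0
        ramp.append(tuple(lerp(s, e, t) for s, e in zip(start_rgb, end_rgb)))
    return ramp

# A blue-to-red ramp, e.g. for shading data values from low to high.
ramp = color_ramp((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), 5)
```

Every color in a scene had to be specified this way, as explicit numeric triples, rather than picked from a palette widget or a prebuilt material library.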
Incidentally, the team had evaluated the potential of the CAVE’s early projection technology, which I believe had launched a year before, but at the time it was too large, too expensive and too immobile for our use.
VREAM rendering model
What we did have was a suite of CAD, animation and rendering software packages, alpha and beta access to select open database and virtual environment source code, access to a broad network of national lab colleagues, and the processing power of the most sophisticated supercomputers in the state: a Cray Y-MP (available at the time for a cool $5M), an SGI Unix workstation, a cluster of Sun SPARCstations, and a Sun Microsystems SPARCserver 2000 with eight 50 MHz processors, each with 2 MB of external cache. Oh yeah, we thought we had hit the jackpot.
We purchased a couple of pairs of CrystalEyes liquid crystal shutter glasses (pictured above), shuttering at a 60 Hz display rate (30 image-pair cycles per second). Although that was all the Sun monitor could handle, it was not too shabby when you consider that the Oculus Rift currently runs at a 90 Hz refresh rate. (I don’t remember the cost of the glasses, but I’d venture to say about $2,000 each.)
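The 60 Hz / 30 Hz relationship follows from how active shutter glasses work: the display alternates left-eye and right-eye frames while the glasses black out the opposite eye in sync, so each eye sees half the display’s refresh rate. A quick sketch of the arithmetic, using the figures above:

```python
# Active-shutter stereo on a single display: successive frames are
# dedicated to alternate eyes, so each eye receives half the refresh rate.

def per_eye_rate(display_hz, eyes=2):
    """Frames per second delivered to each eye through shutter glasses."""
    return display_hz / eyes

# The 1994 setup: a 60 Hz display yields 30 image-pair cycles per second.
legacy = per_eye_rate(60)
```

By contrast, a modern 90 Hz headset drives each eye at the full 90 Hz, since it uses a dedicated display (or panel region) per eye rather than time-sharing one monitor.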
We downloaded the WorldToolKit Virtual World software from SENSE8 Corporation and went to work integrating the scientific visualization overlays, rendered 3D objects and waste site characterization data, some of which had been manually recorded more than two decades earlier, in pencil.
In addition to the challenges mentioned above, the Virtual Environment Project Team of 1994 faced some of the same challenges teams face today, including human factors considerations, eye tracking, seamless rendering transitions, rendering time and realistic object representation.
WorldToolKit Virtual World rendering
What emerged was a virtual environment incorporating the components now widely recognized as the fundamental tech stack of any VR solution: a database, a virtual world authoring system, a rendering environment, an application controller layer and a presentation layer.
It was also a very exciting, interactive virtual environment that rendered worlds ‘on the fly’ and maintained information on the dimensions, depth and contents of the buried objects, all of which could be revealed in the virtual world with a click of the mouse.
Virtual Environment with scientific visualization overlay
Most of the visuals from this work have long since been archived. However, these screenshots highlight some of the early renderings.
And so it was that, after months of research, collaboration and programming, the team successfully developed and presented the Virtual Environment Applications for Buried Waste Characterization Technology Evaluation to the Idaho National Engineering Laboratory in May 1995.
This use of VR as a feasibility tool for complex situations was just the beginning of a long line of VR applications at the Idaho National Laboratory, a series we look forward to sharing with you throughout the year. It continues to showcase the innovative spirit of Idaho and of all our teams pioneering in mixed reality today.