Assistant Professor Roy Magnuson demonstrates his new virtual reality music composition software, solsticeVR.
What happens when a music professor has both the customary background of a musician and composer and a burgeoning aptitude for virtual reality (VR) software design? Well, you get more music, of course, but in an unexpected way.
Roy Magnuson ’05, a trombonist and composer by training, is an assistant professor of music theory and composition in the School of Music. He is a traditionally trained acoustic musician who has composed for orchestra, wind ensembles, concert band, chamber ensembles, vocalists, electroacoustic ensembles, and films. He teaches music theory, aural skills, and composition, mostly to undergraduate music majors.
Now he can add VR software developer to his resume. Magnuson has created solsticeVR software, a teaching and creative tool that allows anyone, regardless of their level of training, the opportunity to compose music.
“VR is a tool, not a gimmick,” Magnuson said. “We are witnessing the inception. We’re entering a new computing age, and just how profoundly mixed reality will be—with a convergence of virtual and augmented reality—we can’t comprehend. I want to figure out how to own that for art and for creatives to do good.”
The software was inspired, in part, by his love of video games.
“Video games are meditational for me, a response loop,” he said. “I used gaming to relax and escape from the pressures of writing a 100-page dissertation. I was overwhelmed as a thinker. Video games were a cleanser.”
All that gaming experience proved to be more than escapism. It taught him lessons he uses today in his VR work.
“With video games I learned a lot about music, story, structure, and rules,” he said. “I learned how you get a response from someone. You lay bread crumbs. You set up expectations or not. Some of the best writers are video gamers.”
Magnuson completed the latest version of solsticeVR in 2018 after making it more user-friendly. The name comes from a series of acoustic compositions he wrote in graduate school called “solstice pieces.”
His goal had been to write “short, simple, quiet, motionless pieces around the two solstices in December and June.”
“These are the moments in time that, at least to me, everything seems to suspend—especially December, when our academic lives have just calmed down,” Magnuson said. “SolsticeVR is a space where you just exist suspended in time, in another place, creating things. The parallel seemed strong to me.”
The best way to understand what Magnuson has created with solsticeVR is to watch him do a live demonstration. At a Research Colloquium Faculty Lecture last fall at University Galleries, Magnuson described the software as “a composition tool that uses the tool that is VR.” On the VR screen, viewers saw two disembodied hands moving through a dark space filled with virtual boards, with labels such as “Mic Effects” and “Ordered Sound Objects,” where the user could press numbered buttons or move lettered levers to elicit different sounds for the composition. Magnuson had chosen “cave” mode, though other viewing options were available, and users can even add rain or snow to the scene.
Sound is a crucial VR component, so Magnuson’s sounds are played through an audio engine with accurate, 3D positional audio. To demonstrate the sound quality, he strapped on his Oculus Rift S VR headset, picked up a large virtual block, and tossed it to create the sound of the block moving away from him. The change in the sound—the perceived volume of the block going from near to far—is known as attenuation. The engine also models the Doppler effect: think of an approaching train whose pitch rises as it nears and drops again as it passes by.
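The two effects the demo illustrates follow simple physics. As a rough sketch (not solsticeVR’s actual code, and the function names here are hypothetical), inverse-distance attenuation and the Doppler shift for a stationary listener can be computed like this:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 °C


def attenuated_gain(distance, reference_distance=1.0):
    """Inverse-distance attenuation: perceived gain halves each time
    the distance from the listener doubles (a common audio-engine model)."""
    return reference_distance / max(distance, reference_distance)


def doppler_frequency(source_freq, radial_velocity):
    """Observed frequency for a sound source moving at radial_velocity
    (m/s, positive = toward a stationary listener). Approaching sources
    sound higher-pitched; receding sources sound lower-pitched."""
    return source_freq * SPEED_OF_SOUND / (SPEED_OF_SOUND - radial_velocity)


# A tossed block: as it flies away, it gets quieter and lower in pitch.
for d, v in [(1.0, 0.0), (2.0, -5.0), (4.0, -5.0)]:
    print(f"distance {d} m: gain {attenuated_gain(d):.2f}, "
          f"pitch {doppler_frequency(440.0, v):.1f} Hz")
```

Game audio engines such as Unity’s typically expose exactly these two knobs—a distance-rolloff curve and a Doppler scale—rather than asking the developer to compute them by hand.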
To create music, a user wears the headset and can add custom audio samples. Users can push the audio away or pull it toward themselves to create a mix with varied pitch, volume, distortion, and echo, among other sound qualities. They can record the mix and save it to a file, which becomes their own 3D audio composition conceived and produced in VR.
Magnuson improved the current iteration of solsticeVR by incorporating hand tracking with Leap Motion, which detects hand and finger motions so that the user can control their actions in VR. It’s akin to using a mouse on a PC.
“With hand tracking you just say, ‘OK, look to your right and touch that button with your hand,’” Magnuson said.
Making his way through a VR scene during his demo, the teacher in Magnuson was apparent. To begin, he offered a favorite quote from the late astronomer Carl Sagan: “If you wish to make an apple pie, you must first invent the universe.”
It’s a reference to his own journey in creating solsticeVR. Coding was not part of his training, and his learning curve remains ongoing. In high school, Magnuson played trombone in local bars in the Quad Cities and for his high school’s band. He received a bachelor’s degree in theory/composition from Illinois State University, a master’s degree in composition from Ithaca College, and a doctoral degree in music from the University of Illinois.
In a short span, he’s become a composer who uses artificial intelligence to create VR spaces. His initial challenge, Magnuson said with amusement, was that he wasn’t sure what search terms to use when Googling how to learn coding.
Magnuson took online courses and certification classes from Unity Technologies, a video game software development company. Magnuson’s initial plan was to write for VR and to create soundtracks for VR spaces. He wanted to create coloring books in VR with painting and sounds.
Within 18 months he had “three or four fully formed failed experiments,” a testament to being “pretty obsessed with it,” he said, laughing.
Roy Magnuson’s compositions have been performed throughout the United States and Europe. In 2018 he received an Outstanding University Creative Activity Award.
“Learning to make game environments is hard,” Magnuson said. “Learning to be OK with failure is also hard. Time is the resource, and wasting it is painful.”
Eventually, the current concept emerged. He described solsticeVR as a powerful tool that can be entertaining and encourages the freedom to experiment.
“You’re not competing with or replacing reality,” he said. “The user input is flexible. It’s performative, but it can solve problems.”
Magnuson’s project has received grants from the Wonsook Kim College of Fine Arts, the College of Applied Science and Technology, and the University. His latest grant, for $3,200, will help fund a professional grade headset.
“I am so thankful for the support of the University, the College of Fine Arts, and the School of Music,” Magnuson said. “Technology is such a gaseous thing to invest in and can become a black hole for funding. I am so fortunate that they took the chance and happy too that it (the VR work) is starting to become something!”
The software is now available through the website SolsticeVR.net. Visitors can watch demonstrations of the software there as well. Magnuson has presented solsticeVR at Electronic Music Midwest and the MIT Center for Advanced Virtuality.
Magnuson called this an inflection point for the University and a time for it to be a powerful leader in instruction and art. For his part, he wants to be prepared for what students will expect in the future. He knows eventually there will be students who arrive at Illinois State expecting a certain level of technology.
“Kids have Oculus headsets at home, and they’ll want to know why we don’t have it here in the College of Fine Arts,” he said.
He currently has seven Illinois State students using his software. He plans to stay on the front line, teaching future students about VR sound, design, lighting, textures, and materials.
“It’ll be cool to be part of,” he said.