Making VR Controllers From The Ground Up

December 30, 2016

Foreword

 

This is a project I worked on as my senior project for my final year of high school. And yes, it does look very similar to the PS Move; however, the goal of the project was not to end up with usable production controllers for VR, but rather to build a solution from the ground up. Admittedly, the quality of the final result could have been much better, but I did accomplish my goal and ended up learning quite a bit. It should also be noted that the project was completed in March, before the Vive was released, using a DK2 on Linux.

 

Drafting

 

My project goal from the beginning was a pair of controllers for use in VR. Ideally, both hands would have their position and rotation tracked. Beyond that, I decided the best input scheme would involve at least one analog joystick for movement (since an analog joystick gives a graded deflection in any direction), one button per finger for binary finger input, and a few general-purpose buttons.

 

For position and rotation tracking, one possibility I investigated was pure IMU-based tracking: integrating acceleration twice for position and angular velocity once for rotation. It quickly became apparent that this would not work, due to the limited accuracy and relatively slow sampling rates of even modern IMUs; an accurate position can only be obtained by integrating accurate samples extremely quickly, and any error compounds with every integration step. Without fast and accurate enough sampling, the only way to use IMUs is with some form of external correction. For rotation, the accelerometer gives the direction of gravity while the device is still, and a magnetometer gives magnetic north, so rotational tracking is in a relatively good state, with several known and tested sensor fusion algorithms available. Position, however, has no such luxury of correction with pure-IMU tracking.
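As a concrete illustration of what those fusion algorithms do, here is a minimal complementary-filter sketch (not the algorithm I ended up using, and magnetometer-based yaw correction is omitted):

```python
import math

def complementary_filter(roll, pitch, gyro, accel, dt, alpha=0.98):
    """gyro = (gx, gy, gz) in rad/s, accel = (ax, ay, az) in g."""
    gx, gy, _ = gyro
    ax, ay, az = accel

    # Integrate angular velocity (this alone drifts over time).
    roll += gx * dt
    pitch += gy * dt

    # Estimate roll/pitch from the direction of gravity (only valid while
    # the device is roughly stationary).
    accel_roll = math.atan2(ay, az)
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # Blend: trust the gyro short-term, the accelerometer long-term.
    roll = alpha * roll + (1.0 - alpha) * accel_roll
    pitch = alpha * pitch + (1.0 - alpha) * accel_pitch
    return roll, pitch
```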

 

My second idea ended up being similar to what FreeTrack does: using LEDs and a camera to determine position. Given a known distance between two LEDs, it is possible to calculate a position. While looking into this, however, I found that tracking two hands at the same time would be relatively complicated, and that it would be extremely easy to occlude the LEDs when facing left or right relative to the camera, or whenever the controller itself was rotated away from the camera. I did, however, accidentally stumble onto a solution that would work.

 

Infrared Diffusion

 

The New 3DS’s face tracking works using a camera without an IR filter, paired with an infrared LED next to the camera that lights the user’s face in dark environments. Experimenting with my DK2 camera, I was able to see how this LED showed up on camera, and also to experiment with diffusing the infrared light through different objects, such as a green bouncy ball. To test this, I drafted a basic OpenCV script in Python which was able to detect my infrared-lit ball:
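The sketch below is a rough reconstruction of that kind of detection rather than the original script: it thresholds the brightest region of the frame (the IR-lit sphere) and fits a circle around it. The camera index is an assumption.

```python
import cv2

cap = cv2.VideoCapture(0)  # camera index is an assumption

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (11, 11), 0)
    # The IR-lit sphere shows up much brighter than the rest of the scene.
    _, mask = cv2.threshold(blur, 200, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(biggest)
        cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 0), 2)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
```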

The reason infrared spheres are significant, however, lies in the properties of a sphere when projected onto a 2D surface: no matter how the sphere is oriented, it always projects to a circle. Given this, the center of the tracked circle gives the X and Y position, and the radius of the projected circle can be used to determine Z depth. With this in mind, I ordered a pack of styrofoam spheres on Amazon and some infrared LEDs to get an actual hardware solution going.

 

In the meantime, however, I still needed an actual controller, and a casing for it. To handle the IMUs, analog joysticks, and buttons, I opted to use a Teensy 3.2 I had bought alongside some unrelated OSHPark boards I had fabricated months earlier. I had also bought two MPU-9250s for the actual motion tracking. Getting rotation from the Teensy over serial into an application ended up being relatively simple, given the wealth of code and examples already out there:
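On the Python side, reading it looks roughly like the sketch below, assuming the firmware prints one comma-separated quaternion per line (the device path and wire format here are assumptions, not the project's exact ones):

```python
import serial

port = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def read_quaternion():
    """Read one 'w,x,y,z' line from the Teensy; returns None on bad input."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return None
    try:
        w, x, y, z = (float(v) for v in line.split(","))
    except ValueError:
        return None  # ignore partial or garbled lines
    return w, x, y, z
```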

Once my styrofoam spheres had arrived, I began working on extending my Python script into something which could actually track position. For the most part this involved a lot of tuning with OpenCV to best isolate my infrared sphere, especially in environments with a fair amount of ambient infrared light (incandescent lightbulbs, sunlight, etc.). Once I had the sphere isolated, I spent some time working out the mapping from camera pixels to real-life millimeters for X and Y, the relationship between the projected radius and the Z position, and how the Z position scales the X and Y positions. I ended up with some funky ratios which worked surprisingly well, though they probably aren’t *really* that accurate. In my own tests they seemed accurate enough, so I went with it.
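For reference, the textbook pinhole-camera version of that mapping looks something like the sketch below; the focal length, sphere radius, and image size are placeholders rather than my actual calibration values, and my "funky ratios" were derived empirically rather than like this.

```python
FOCAL_PX = 700.0         # camera focal length in pixels (assumed)
SPHERE_RADIUS_MM = 40.0  # physical radius of the styrofoam sphere (assumed)
CX, CY = 320.0, 240.0    # optical center for an assumed 640x480 image

def sphere_to_world(px, py, radius_px):
    # A sphere of known size projects to a circle whose radius shrinks
    # roughly in proportion to distance, which gives us depth...
    z_mm = FOCAL_PX * SPHERE_RADIUS_MM / radius_px
    # ...and once depth is known, pixel offsets from the optical center
    # scale back into real-world X/Y.
    x_mm = (px - CX) * z_mm / FOCAL_PX
    y_mm = (py - CY) * z_mm / FOCAL_PX
    return x_mm, y_mm, z_mm
```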

 

Demonstration and Printing

 

With position tracking done, I realized that as nice as showing off a bare controller is, I needed a demo. As such, I opted to modify my existing DK2 port of OpenJK to communicate with my Python daemon. To get data from my Python script to OpenJK, I ended up just using a memory-mapped file for some simple one-way IPC:
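As a rough idea of what that looks like, here is a minimal sketch of the Python side; the file path and field layout are examples, not the project's actual ones, and the OpenJK side simply polls the same file.

```python
import mmap
import os
import struct

PATH = "/tmp/controller_state"  # example path, not the project's
FMT = "<3f4f"                   # position (x, y, z) + rotation quaternion
SIZE = struct.calcsize(FMT)

fd = os.open(PATH, os.O_CREAT | os.O_RDWR)
os.ftruncate(fd, SIZE)
shared = mmap.mmap(fd, SIZE)

def publish(pos, quat):
    """Overwrite the shared region with the latest controller pose."""
    shared.seek(0)
    shared.write(struct.pack(FMT, *(tuple(pos) + tuple(quat))))
```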

With a good amount going on between my Python script and OpenJK, it was time to tie everything together: I needed to actually get my microcontroller going and an enclosure made. 3D printing the controller was a whole process in itself, since I had the foolish idea that I could just build a printer to print it on (I had also planned to use the printer for other things). Needless to say, 3D printing and tuning a 3D printer is pretty difficult, and my initial design was awful:

The immediate issue was that my prints would consistently lift off the bed, resulting in inaccurate prints which could not be used. The bigger problem, though, was the model itself:

My model was extremely curvy, and frankly I have no idea what I was thinking when I designed it, although I did design it before I really understood the limitations of 3D printing at this level. I was also extremely new to CAD software, so the problems just compounded. Eventually I designed a version 2 which solved all the problems of my initial design: no curves, no weird bends, KISS.

Even with the redesign, however, ABS really does not work well for large prints that are prone to lifting, and ordering PLA was not an option in the time frame I had. Luckily 3DHubs exists, and I managed to find someone with a well-tuned printer to print my design while I worked on the software some more.

 

Hacking ovrd some more

 

One particular issue I wanted to work on was the fact that my infrared camera also happened to be the DK2’s tracking camera, which can be quite the problem. My script could grab the camera before ovrd got to it, but the consequence of ovrd not being able to access it was losing position tracking of the HMD entirely, which really isn’t all that friendly to end users. After some investigation, I found that I could examine the memory of the ovrd process while it was running and steal its camera input instead:
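This won't reproduce what my script did, but as a rough illustration of the general mechanism on Linux (reading another process's memory through /proc/&lt;pid&gt;/mem, which requires ptrace permissions or root), something like the sketch below applies; the frame buffer address and size are hypothetical, and locating them inside ovrd was the hard part.

```python
def read_process_memory(pid, address, length):
    """Read `length` bytes at `address` from another process's memory."""
    with open("/proc/%d/mem" % pid, "rb") as mem:
        mem.seek(address)
        return mem.read(length)

# e.g. pulling one 752x480 8-bit frame out of ovrd, assuming we have already
# located where it keeps the latest camera image (hypothetical values):
# frame_bytes = read_process_memory(ovrd_pid, frame_buffer_addr, 752 * 480)
```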

Additional investigation eventually led me to conclude that this wouldn’t work: ovrd sets an extremely low exposure on the camera in order to isolate the HMD LEDs and track their blinking patterns for accurate HMD position tracking. While I managed to reverse engineer ovrd and force a higher exposure, doing so also killed the HMD position tracking, so there really wasn’t any winning. It was a neat experiment, however, and I feel like it could be useful for exposing the camera to track other devices, so I left the functionality in my Python script, disabled by default.

 

Once my 3D printed enclosure arrived, the real fun began. Assembling the actual inputs was a mess of wires, and I really regretted not working on this much earlier, when having a board fabricated by OSHPark was still an option.

The buttons I made for the back of the controller ended up having a few problems. For one, they were too large, and sanding couldn’t easily fix that since I lacked the proper tools and the material was PLA. There was also the issue that the buttons could fall through into the chassis due to a mistake in the design. However, since my own printer was tuned well enough to print smaller objects, I printed some better-fitting buttons and some small triangular stops to keep the buttons from falling through, and with that my assembly continued:

With one half done, I took a small break from pain and worked some more on getting button, analog joystick, and IMU inputs into my Python script and on into my IPC memmap:
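In sketch form, the extra parsing amounts to something like this, assuming the Teensy now sends a combined line of quaternion, joystick, and button values (the exact wire format is an assumption), with the result then packed into the memmap as before:

```python
def parse_input_line(line):
    """Parse 'qw,qx,qy,qz,joyx,joyy,buttons' into typed fields."""
    parts = line.strip().split(",")
    if len(parts) != 7:
        return None  # skip partial or garbled lines
    quat = tuple(float(v) for v in parts[0:4])
    joystick = (float(parts[4]), float(parts[5]))
    buttons = int(parts[6])  # one bit per button
    return quat, joystick, buttons
```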

For my second half, I ended up using an HDMI cable to transfer all the signals I needed straight to the microcontroller, and with that I had my controller’s hardware 100% done:

The main issue from here was duplicating all of my inputs for the second controller, and dealing with two IMUs and two circles to track instead of one. There was also the issue that my OpenJK integration was a bit… subpar. The weapon model moved with the controllers on all axes, but aiming was just a hack that mapped the controller X and Y to the mouse. Luckily it was enough to get a pass from my program teacher during our check-in a few weeks before the actual presentation:

 

After a fair bit of work, I managed to work out a tracking method which kept the two controllers from being confused when they crossed over each other or when one moved outside the view of the DK2 camera. I also got OpenJK to use purely the rotation and position from my controller to control the gun in-game, and after tuning it enough not to fail on presentation day, my time effectively ran out and I unfortunately had no time to work on other demos. I would say the product came out rather well, better than I had even anticipated:
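For what it's worth, the core idea can be sketched as a simple nearest-neighbour assignment (a sketch of the concept, not my exact code): match each detected circle to the controller whose last known 2D position is closest, and coast on the previous pose when a controller leaves the camera's view.

```python
def assign_detections(last_positions, detections):
    """last_positions: {controller_id: (x, y)}, detections: list of (x, y, r)."""
    assignments = {}
    remaining = list(detections)
    for cid, (lx, ly) in last_positions.items():
        if not remaining:
            assignments[cid] = None  # occluded or out of view: keep old pose
            continue
        # Pick the detection closest to where this controller was last seen.
        best = min(remaining, key=lambda d: (d[0] - lx) ** 2 + (d[1] - ly) ** 2)
        remaining.remove(best)
        assignments[cid] = best
    return assignments
```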

Conclusion and Reflection

 

Making controllers for VR is hard. It really, honestly is. However, I learned quite a bit doing this project: I used Python for a somewhat advanced application for the first time, instead of just for super basic scripts. I got to use OpenCV for the first time, which was pretty cool. I also learned how 3D printers work, built my own, and had the opportunity to learn CAD software along the way.

 

I somewhat wish I had a bit more time to work on this (or money to improve it now that I *do* have time). For one, doing the tracking in a Python script definitely isn’t optimal. It was great at the time since it allowed rapid development and testing, but performance-wise it really isn’t ideal. My serial protocol was also pretty awful; I really should have pushed all my data as raw binary instead of strings, but again, time. The final product, while it does work, is pretty clunky and could be better designed. If I had the chance to redo it, I would look into fabricating a single board so that the enclosure could be a lot smaller and more accurate. Wireless would also be nice, but I’m not 100% sure how handling batteries would go (the ESP32 does look like an interesting candidate for both wireless and the microcontroller, although I’m not sure how latency would fare).
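If I were redoing the serial link, something like a fixed-size binary packet would replace the comma-separated strings; here is a sketch using Python's struct module, with a field layout that is just an example:

```python
import struct

PACKET_FMT = "<4f2fB"   # quaternion, joystick x/y, button bitmask (example)
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def unpack_packet(raw):
    """Decode one fixed-size binary packet read from the serial port."""
    if len(raw) != PACKET_SIZE:
        return None
    qw, qx, qy, qz, jx, jy, buttons = struct.unpack(PACKET_FMT, raw)
    return (qw, qx, qy, qz), (jx, jy), buttons
```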

 

Since I do appreciate when people release their source, I’ve decided to push the final source for my project to my GitHub. I would have liked to clean it up some more first, but lack of motivation is a bit of an issue here, since I’ve been busy preparing for university while also working with decaf-emu on their Wii U Toolkit (WUT). In any case, I feel like it might end up being useful in some respects, and if not useful, at least interesting to look at.
