Unity developers will soon be able to apply haptics technology to Unity VR projects.
Unity is one of the most popular videogame engines in the world, forming the backbone of a staggering number of videogames, mobile apps and even virtual reality (VR) experiences. As the demand for further immersion in VR continues to rise, there is a corresponding need to integrate sensory technologies like haptics into engines such as Unity, which is what Ultrahaptics is supplying with its latest release.
Ultrahaptics uses ultrasound-based technology to let users feel interactions with virtual objects using their bare hands, with no gloves or other bulky sensor apparatus required. The technology creates points of pressure in mid-air that can be shaped to produce the sensation of 3D shapes.
The company is launching the Ultrahaptics Core Asset (UCA), a new haptics plugin designed to let developers use Ultrahaptics technology when creating VR and AR experiences.
The UCA is currently in closed beta, with plans to release it into open beta early in Q3 of 2018. It has been created to allow designers and developers to add haptics to experiences built with Unity. The plugin will come with a range of pre-developed sensations that can be dropped into Unity projects, along with a visualisation tool that shows developers where haptics have been applied to digital objects and which sensations will be felt. A scripting API is also included so developers can craft their own haptic sensations.
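To give a sense of what "crafting a sensation" involves: mid-air haptic sensations of this kind are typically built from an ultrasonic focal point moved rapidly along a path, so that, for example, sweeping the point around a circle fast enough makes the palm feel a continuous ring. The sketch below illustrates that general idea only; the function name, parameters, and rates are illustrative assumptions, not the actual UCA scripting API (which is C#-based inside Unity):

```python
import math

def ring_sensation_points(radius_m=0.02, height_m=0.2,
                          draw_freq_hz=100, update_rate_hz=16000):
    """Return one period of focal-point positions tracing a circle in mid-air.

    Coordinates are metres relative to the centre of a hypothetical emitter
    array: the focal point orbits at `radius_m`, at `height_m` above the array.
    `draw_freq_hz` is how many times per second the full circle is traced;
    `update_rate_hz` is how often the focal point position is recomputed.
    All values here are illustrative, not Ultrahaptics specifications.
    """
    points_per_loop = update_rate_hz // draw_freq_hz
    points = []
    for i in range(points_per_loop):
        angle = 2 * math.pi * i / points_per_loop
        points.append((radius_m * math.cos(angle),
                       radius_m * math.sin(angle),
                       height_m))
    return points
```

A real plugin would stream positions like these to the haptics hardware every update tick; the pre-developed sensations mentioned above would package such paths (and their timing) so developers can drop them into a scene without writing the path maths themselves.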
Richard Hayden, Ultrahaptics’ Digital Product Lead, said, “We’re really excited about this tool. Now Unity developers can incorporate mid-air haptics in a way that drops seamlessly into their established workflow. You no longer need to be an expert to bring Ultrahaptics technology into your experiences. I can’t wait to see what the creative minds of the Unity developer community come up with!”