I-nteract Allows Users To Touch And 3D Print MR Objects

November 6, 2020

Due to their general ubiquity, it may not be readily apparent just how unintuitive computers are for 3D computer-aided design (CAD). A mouse or trackpad, along with clicks and some typing, is sufficient for the mathematical side of the equation but can't replace hands-on crafting and sculpting. For this reason and more, a team of researchers has designed I-nteract, a mixed reality system that combines CAD, 3D printing, haptics, a virtual reality (VR) headset and artificial intelligence to merge the digital and physical worlds.


The lack of tactility in CAD work prevents untrained users from easily creating models. However, this is just one of many problems the researchers set out to address. The authors of "I-nteract 2.0: A Cyber-Physical System to Design 3D Models using Mixed Reality Technologies and Deep Learning for Additive Manufacturing," published on arXiv.org, also point out that the minimal feedback between 3D printing and modeling forces designers to fabricate a part multiple times during iteration, with each print used to validate or modify the design.


To overcome these issues, the team created a cyber-physical system (CPS) that allows users to interact with virtual and physical objects within a single visio-haptic mixed reality (VHMR) environment. The goal was an intuitive mixed reality interface for 3D scanning a physical object, creating its digital twin, and then simulating the physical properties of design modifications on that digital twin. The authors write:


“To the best of our knowledge, I-nteract 2.0 is the first VHMR system that enables generative AI based CAD in MR for AM. Integration with CSG allows the user to design 3D models from scratch using primitive 3D objects (such as cuboids, cylinders, spheres, etc.) and his/her creative skills in a MR environment.”


The CPS consists of the following hardware components:

- HoloLens smartglasses for visual feedback

- Dexmo haptic gloves for force feedback

- VIVE gloves for hand tracking


While these hardware components enable a VHMR environment, the heart of the system is the software itself, which handles the modeling of objects to be 3D printed. The software relies on constructive solid geometry (CSG) and machine learning, the latter automating the portions of the design process that require modeling expertise.

CSG is a solid modeling technique in which complex 3D models are built by applying Boolean set operations to simple geometries. With I-nteract, users first select a primitive (such as a cube, sphere or cylinder) with an index finger or by voice command. In addition to basic transformations, such as scaling along individual dimensions, users can add, subtract or intersect a new shape with the original. This is repeated until a complete model is assembled from many primitives.
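The Boolean operations described above can be sketched with signed distance functions (SDFs), where a point is inside a solid when its distance value is negative. This is an illustrative stand-in, not the paper's implementation; the primitives and the cube-with-a-hole model below are invented for the example.

```python
import math

# SDF primitives: return negative inside the solid, positive outside.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: math.sqrt((x-cx)**2 + (y-cy)**2 + (z-cz)**2) - r

def box(cx, cy, cz, half):
    return lambda x, y, z: max(abs(x-cx), abs(y-cy), abs(z-cz)) - half

# CSG Boolean operations as min/max combinations of SDFs.
def union(a, b):     return lambda x, y, z: min(a(x, y, z), b(x, y, z))
def intersect(a, b): return lambda x, y, z: max(a(x, y, z), b(x, y, z))
def subtract(a, b):  return lambda x, y, z: max(a(x, y, z), -b(x, y, z))

# Example: a cube with a spherical bite carved out of one corner.
model = subtract(box(0, 0, 0, 1.0), sphere(1, 1, 1, 0.8))

print(model(0, 0, 0) < 0)    # cube's center remains solid
print(model(1, 1, 1) >= 0)   # corner has been carved away
```

Mesh-based CSG engines (as in actual CAD kernels) perform the same add/subtract/intersect operations on boundary representations rather than distance fields, but the compositional logic is identical.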

3D models can also be generated automatically by photographing objects with the HoloLens: the captured image is used to reconstruct a mesh and find its best matches in a database of printable 3D models. The software then presents the top five candidates from the database for the user to choose from. Once the user is ready, the model can be printed via OctoPrint.
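The retrieval step can be sketched as nearest-neighbor search over feature vectors. The feature extractor (a deep network in the paper) is stubbed out here, and the database entries and query vector are made-up examples, not the authors' data.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k_matches(query_vec, database, k=5):
    """Rank (name, feature_vector) entries by similarity to the query."""
    scored = [(name, cosine_similarity(query_vec, vec)) for name, vec in database]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:k]

# Hypothetical database of printable models with precomputed features.
database = [
    ("mug",     [0.9, 0.1, 0.0]),
    ("vase",    [0.7, 0.6, 0.1]),
    ("gear",    [0.0, 0.2, 0.9]),
    ("bottle",  [0.8, 0.5, 0.2]),
    ("bracket", [0.1, 0.1, 0.8]),
    ("bowl",    [0.6, 0.3, 0.1]),
]

query = [0.85, 0.2, 0.05]  # stand-in for features extracted from the photo
for name, score in top_k_matches(query, database, k=5):
    print(name, round(score, 3))
```

In a real system the five returned meshes would be rendered in the headset for the user to pick from, exactly as the article describes.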


Future work will include streamlining the design process so that non-expert designers can use the tool more easily, as well as features for monitoring the print process to improve build quality. The work was carried out by Ammar Malik of the Department of Electrical and Electronic Engineering at University College Dublin, Hugo Lhachemi of L2S at CentraleSupelec, and Robert Shorten of the Dyson School of Design Engineering at Imperial College London. Support was provided in part by the Science Foundation Ireland under the European Regional Development Fund and by I-Form industry partners.
