The update has added support for Apple’s ARKit 2.0 and Google ARCore 1.2.
After its usual process of releasing several previews, Epic Games has today launched the next version of its popular videogame engine, Unreal Engine 4.20. The release makes it easier for developers to build realistic characters and immersive environments across videogames, film and TV, virtual reality (VR), augmented reality (AR), mixed reality (MR) and enterprise applications.
Unreal Engine 4.20 features hundreds of optimisations, especially for iOS and Android, as well as Magic Leap One. Epic previously announced Early Access support for the Magic Leap One: Creator Edition at the Game Developers Conference (GDC) 2018, as part of a wider partnership between the two companies; Unreal Engine 4.20 now fully supports development for the AR headset.
Continuing the AR support, Unreal Engine 4.20 adds support for Apple’s ARKit 2.0 and Google ARCore 1.2. For ARKit 2.0 this includes better tracking quality, vertical plane detection, face tracking, 2D image detection, 3D object detection, and persistent and shared AR experiences. For ARCore 1.2, the engine includes support for vertical plane detection, Augmented Images, and Cloud Anchors.
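In Unreal, AR features like these are driven by a session configuration object handed to the AR session at startup, with the same code path mapping to ARKit on iOS and ARCore on Android. As a rough C++-style sketch (illustrative pseudocode only — the exact property and enum names here follow the engine’s AR framework conventions but are assumptions, not confirmed 4.20 API):

```cpp
// Illustrative pseudocode: configure and start a cross-platform AR session.
// Property and enum names are assumptions based on UE4 naming conventions.
UARSessionConfig* Config = NewObject<UARSessionConfig>();

// Vertical plane detection is newly supported by both ARKit 1.5+ and ARCore 1.2.
Config->PlaneDetectionMode = EARPlaneDetectionMode::HorizontalAndVertical;

// Start the session; the engine selects ARKit or ARCore per platform.
UARBlueprintLibrary::StartARSession(Config);
```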
- New: Added a label to the VR spectator screen render call so it is distinguishable in profiling tools, such as RenderDoc.
- New: Added the ability for Motion Controller components to track/display HMD devices by using “HMD” as the source name.
- New: Added experimental support for lens calibration on MR projects using OpenCV.
- New: Updated the SteamVR SDK to version 1.0.11, which includes improved Vulkan support.
- New: Improved SteamVR compositor timing and overall engine performance through more accurate CPU utilization data.
- New: Added Render Bridge base class, FXRRenderBridge, to reduce code duplication when creating an XR Plugin with a Custom Present implementation.
- New: Added initial support for omni-directional stereo captures.
- New: Made Oculus code-scheduling functions available for use by other XR Plugins through the “Head Mounted Display” module.
- New: Added virtual curves for head rotation to FaceAR’s LiveLink face-tracking stream.
- New: Added support for instanced stereo translucent rendering.
- New: Added support for enabling the ARKit 1.5 auto-focus setting.
- New: Added checks for ARKit 1.0 availability when creating the AR session, preventing calls to invalid selectors on older iOS devices.
- New: Added functions for checking ARKit version availability at runtime.
- New: Refactored the ARKit support #defines, simplifying the wrapping of individual features by ARKit version.
- New: Added a console command to change where Face AR publishes LiveLink curve data, e.g. “LiveLinkFaceAR SendTo=192.168.1.1”.
- New: Wrapped vertical plane detection in an “if iOS 11.3” availability check, since ARKit 1.5 is only available on iOS 11.3 and later.
- New: Added orientation to the “AR Candidate Image” object for passing to the detection system.
- New: Added support for handling “AR Image Anchor” notifications from ARKit.
- New: Added a friendly name to UARCandidateImage objects.
- New: Added base types for detecting images in an AR session.
- New: Added the name from the candidate image when creating the Apple-side representation.
- New: Added support for configuring which images to detect during an AR session.
- New: Improved debug layer rendering on Oculus, circumventing a superfluous blit by rendering directly to the layer.
- New: Added models to the Oculus plugin for HMD and Sensors.
- New: Added a “Use Camera Rotation” feature to the “Stereo Panorama” plugin, which lets captures use the current camera’s rotation. To enable it, use the console command “SP.UseCameraRotation 7”.
- New: The Camera Component can now tell the “Late Update Manager” to skip the late update for the current frame; the default XR Camera checks this flag before applying the late update to the camera.
- New: Minor refactor of PSVR reprojection for Frame Counter comparison.
- New: Added a new PSVR API function for getting a transform between “floor” and “eye” tracking spaces.
- New: Added a new PSVR delegate that can detect when an app changes between “floor” and “eye” tracking space.
- Removed: The unused Stereo Rendering method “Get Custom Present” has been removed.
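The image-detection entries above — candidate images with friendly names and orientations, session-level configuration of which images to detect, and “AR Image Anchor” notifications when one is found — fit together roughly like this (illustrative C++-style pseudocode; the factory function’s signature and the config method name are assumptions, not confirmed 4.20 API):

```cpp
// Illustrative pseudocode: register a candidate image for detection.
// The factory signature and AddCandidateImage are assumed names.
UARCandidateImage* Marker = UARCandidateImage::CreateNewARCandidateImage(
    PosterTexture,        // UTexture2D* to look for in the camera feed
    TEXT("MoviePoster"),  // friendly name (see the UARCandidateImage note)
    30.f, 40.f,           // estimated physical size of the printed image
    EARCandidateImageOrientation::Portrait); // orientation hint

SessionConfig->AddCandidateImage(Marker); // images to detect this session
```

Once the session is running with this configuration, a recognised image surfaces through the engine’s “AR Image Anchor” handling listed above, carrying the candidate image’s friendly name so gameplay code can tell markers apart.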