How AI is shaping the next generation of XR: top 4 use cases

Solutions that let people explore fully immersive, computer-generated worlds (VR) or overlay computer graphics onto a view of their immediate environment (AR) are increasingly being adopted in both education and enterprise.

Rapid advances in deep learning, access to large volumes of data, and major innovations in AR/VR devices and software put us in a unique position to apply these technologies to real-world problems: customer service automation, employee productivity, training, visualization, remote troubleshooting, inspection, and documentation.

In this post, we discuss four major AI-powered use cases in XR applications:

  1. Hand Tracking on Oculus Quest and HoloLens
  2. Oculus Insight: Powered by AI
  3. Real-Time AR Self-Expression with Machine Learning
  4. Blending Realities with the ARCore Depth API

1. Hand Tracking on Oculus Quest and HoloLens

Precise hand-tracking unlocks a range of new XR experiences and reduces friction in existing ones. People can pause a movie in VR with just a gesture, for example, or express themselves more naturally in social games. In enterprise applications, an instructor could lead a VR-based training class without having to maintain a fleet of paired, charged controllers.

More broadly, hand-tracking will make VR feel more natural and intuitive, and help developers create new ways for people to interact in virtual worlds. 

How it works: https://ai.googleblog.com/2019/08/on-device-real-time-hand-tracking-with.html
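
The linked post describes Google's on-device hand-tracking pipeline, which was open-sourced as part of the MediaPipe framework. Quest and HoloLens ship their own proprietary trackers, so the minimal Python sketch below is only an illustration of the same single-camera, ML-based approach (it assumes the `mediapipe` and `opencv-python` packages are installed):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # a webcam stands in for a headset camera here
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # 21 3D landmarks per detected hand -- enough signal to
            # recognize gestures such as a pinch or an open palm.
            for landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(frame, landmarks,
                                          mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
cv2.destroyAllWindows()
```

On a headset, the same landmark stream would feed gesture recognition (pinch-to-select, palm menus) rather than on-screen drawing.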

Demo – Quest:


Demo – HoloLens:

2. Oculus Insight: Powered by AI 

The Oculus Insight system uses a custom hardware architecture and advanced computer vision algorithms — including visual-inertial mapping, place recognition, and geometry reconstruction — to establish the location of objects in relation to other objects within a given space. This novel algorithm stack enables a VR device to pinpoint its location, identify aspects of room geometry (such as floor location), and track the positions of the headset and controllers with respect to a 3D map that is generated and constantly updated by Insight. 

How it works: https://ai.facebook.com/blog/powered-by-ai-oculus-insight/
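
Facebook has not published Insight's code, but the predict-with-IMU, correct-with-vision structure at the heart of any visual-inertial tracker can be sketched in a few lines. The toy Python example below (all constants and noise values are invented for illustration; Insight's real SLAM stack is far more sophisticated) shows why fusing a fast-drifting IMU with slower visual fixes yields stable tracking:

```python
import numpy as np

dt = 0.001                # IMU samples at 1 kHz (prediction rate)
vision_every = 33         # a camera fix arrives roughly every 33 ms
alpha = 0.05              # how strongly a visual fix corrects the estimate

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 0.0
est_pos, est_vel = 0.0, 0.0

for step in range(10_000):
    accel = np.sin(step * dt)               # ground-truth motion (1D toy)
    true_vel += accel * dt
    true_pos += true_vel * dt

    imu_accel = accel + rng.normal(0, 0.5)  # noisy IMU reading
    est_vel += imu_accel * dt               # predict: integrate the IMU
    est_pos += est_vel * dt                 # (drifts quickly on its own)

    if step % vision_every == 0:            # correct: visual fix vs. the map
        visual_pos = true_pos + rng.normal(0, 0.01)
        est_pos += alpha * (visual_pos - est_pos)

print(f"final tracking error: {abs(est_pos - true_pos):.4f} m")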
Demo – Quest:

3. Real-Time AR Self-Expression with Machine Learning

One of the key challenges in making self-expression AR features possible is properly anchoring the virtual content to the real world, a process that requires a unique set of perceptual technologies able to track the highly dynamic surface geometry across every smile, frown, or smirk.

To make all this possible, ARCore employs machine learning (ML) to infer approximate 3D surface geometry from a single camera input, without the need for a dedicated depth sensor. This approach enables AR effects at real-time speeds, using TensorFlow Lite for mobile CPU inference or its new mobile GPU functionality where available. This is the same technology that powers YouTube Stories' creator effects, and it is also available to the broader developer community via the latest ARCore SDK release and the ML Kit Face Contour Detection API.

How it works: https://ai.googleblog.com/2019/03/real-time-ar-self-expression-with.html
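
The linked post describes the single-camera 3D face mesh that Google subsequently shipped in the ML Kit Face Contour Detection API and the open-source MediaPipe framework. A short Python sketch using MediaPipe's Face Mesh solution (assuming `mediapipe` and `opencv-python` are installed; `selfie.jpg` is a placeholder input) shows the inferred surface geometry that AR effects attach to:

```python
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

image = cv2.imread("selfie.jpg")  # placeholder input image
with mp_face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
    results = mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark  # 468 3D points
    h, w = image.shape[:2]
    for lm in landmarks:
        # x/y are normalized image coordinates; z is relative depth,
        # which is what lets virtual glasses or masks hug the face surface.
        cv2.circle(image, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
    cv2.imwrite("selfie_mesh.jpg", image)
```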
Demo – ARCore: 

4. Blending Realities with the ARCore Depth API

The ARCore Depth API allows developers to use its depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.

One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real world objects. Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene.

How it works: https://developers.googleblog.com/2019/12/blending-realities-with-arcore-depth-api.html
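
The Depth API itself is an Android (Java/Kotlin/NDK) API, but the occlusion test it enables is easy to show in isolation. The numpy sketch below uses entirely synthetic depth values to stand in for ARCore's depth-from-motion output and draws a virtual object only where it is closer to the camera than the real surface:

```python
import numpy as np

H, W = 480, 640
# Synthetic stand-in for the ARCore depth map: a wall 3 m away with a
# chair 1 m away in the lower middle of the frame.
real_depth = np.full((H, W), 3.0, dtype=np.float32)
real_depth[200:480, 250:400] = 1.0

# A virtual object rendered 2 m away; inf means "no virtual pixel here".
virtual_depth = np.full((H, W), np.inf, dtype=np.float32)
virtual_depth[150:350, 300:500] = 2.0

camera_rgb = np.zeros((H, W, 3), dtype=np.uint8)   # stand-in camera frame
virtual_rgb = np.zeros_like(camera_rgb)
virtual_rgb[150:350, 300:500] = (255, 128, 0)

# The occlusion test: a virtual pixel is visible only where it is closer
# than the real surface, so the object hides behind the chair but covers
# the wall. `composite` is the final blended frame.
visible = virtual_depth < real_depth
composite = np.where(visible[..., None], virtual_rgb, camera_rgb)
print(f"virtual pixels drawn: {visible.sum()} "
      f"of {(virtual_depth < np.inf).sum()}")
```
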
Demo – ARCore: