{"id":302,"date":"2020-01-29T07:06:10","date_gmt":"2020-01-29T07:06:10","guid":{"rendered":"http:\/\/www.icomputinglabs.in\/blog\/?p=302"},"modified":"2023-08-03T10:15:48","modified_gmt":"2023-08-03T10:15:48","slug":"how-ai-is-shaping-the-next-generation-xr-top-4-use-cases","status":"publish","type":"post","link":"https:\/\/www.icomputinglabs.in\/blog\/how-ai-is-shaping-the-next-generation-xr-top-4-use-cases\/","title":{"rendered":"How AI is shaping the next generation XR: top 4 use cases"},"content":{"rendered":"\n<p>Solutions that allow humans to explore fully immersive computer-generated worlds (VR) and overlay computer graphics onto our view of our immediate environment (AR) are increasingly being adopted in both education and enterprise.<br><\/p>\n\n\n\n<p>With rapid advances in deep learning (AI), access to large volumes of data, and major innovations in AR\/VR devices and software, we are in a unique position today to leverage these technologies for a wide range of use cases that address real-life scenarios. These include customer service automation, employee productivity, training, visualization and remote troubleshooting, inspection and documentation.<br><\/p>\n\n\n\n<p>In this blog, we discuss four major use cases powered by AI in XR applications:&nbsp;<\/p>\n\n\n\n<ol class=\"wp-block-list\"><li>Hand Tracking on Oculus Quest and Hololens&nbsp;<\/li><li>Oculus Insight: Powered by AI<\/li><li>Real-Time AR Self-Expression with Machine Learning<\/li><li>Blending Realities with the ARCore Depth API<\/li><\/ol>\n\n\n\n<p><\/p>\n\n\n\n<ol class=\"wp-block-list\"><li><strong>Hand Tracking on Oculus Quest and Hololens&nbsp;<\/strong><\/li><\/ol>\n\n\n\n<p>Precise hand tracking unlocks a range of new experiences and reduces friction in existing XR experiences. People can pause a movie in VR with just a gesture, for example, and express themselves more naturally in social games. 
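<\/p>\n\n\n\n<p>To make this concrete, here is a minimal, illustrative Python sketch of how an application might turn tracked hand landmarks into a gesture such as a pinch. The 21-point landmark layout and the distance threshold are assumptions modeled on common on-device hand-tracking models, not the actual Oculus or HoloLens API.<\/p>

```python
import math

# Illustrative indices following a common 21-landmark hand model
# (an assumption modeled on MediaPipe Hands): 4 = thumb tip, 8 = index fingertip.
THUMB_TIP, INDEX_TIP = 4, 8

def is_pinch(landmarks, threshold=0.05):
    """Return True when the thumb tip and index fingertip are close enough
    to count as a pinch. `landmarks` is a list of 21 (x, y, z) tuples in
    normalized coordinates; `threshold` is an assumed tuning constant."""
    tx, ty, tz = landmarks[THUMB_TIP]
    ix, iy, iz = landmarks[INDEX_TIP]
    dist = math.sqrt((tx - ix) ** 2 + (ty - iy) ** 2 + (tz - iz) ** 2)
    return dist < threshold

# Toy frames: spread-out landmarks vs. index tip moved next to the thumb tip.
open_hand = [(0.1 * i, 0.0, 0.0) for i in range(21)]
pinching = list(open_hand)
pinching[INDEX_TIP] = (0.41, 0.0, 0.0)  # thumb tip sits at x = 0.4
print(is_pinch(open_hand), is_pinch(pinching))  # -> False True
```

<p>In a real headset pipeline the landmark list would arrive per frame from the tracking runtime, and the threshold would be tuned per device.<\/p>\n\n\n\n<p>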
In enterprise applications, an instructor could lead a VR-based training class without having to maintain a fleet of paired, charged controllers.<\/p>\n\n\n\n<p>More broadly, hand-tracking will make VR feel more natural and intuitive, and help developers create new ways for people to interact in virtual worlds.&nbsp;<\/p>\n\n\n\n<p>How it works: <a href=\"https:\/\/ai.googleblog.com\/2019\/08\/on-device-real-time-hand-tracking-with.html\">https:\/\/ai.googleblog.com\/2019\/08\/on-device-real-time-hand-tracking-with.html<\/a><\/p>\n\n\n\n<p>Demo &#8211; Quest: <\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Hand Tracking on Oculus Quest  |  Oculus Connect 6\" width=\"770\" height=\"433\" src=\"https:\/\/www.youtube.com\/embed\/2VkO-Kc3vks?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><br>Demo &#8211; Hololens: <\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"MRTK - Mixed Reality Toolkit&#039;s Hand Interaction Examples with HoloLens 2\" width=\"770\" height=\"433\" src=\"https:\/\/www.youtube.com\/embed\/wogJv5v9x-s?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>2. 
<strong>Oculus Insight: Powered by AI&nbsp;<\/strong><\/p>\n\n\n\n<p>The Oculus Insight system uses a custom hardware architecture and advanced computer vision algorithms \u2014 including visual-inertial mapping, place recognition, and geometry reconstruction \u2014 to establish the location of objects in relation to other objects within a given space. This novel algorithm stack enables a VR device to pinpoint its location, identify aspects of room geometry (such as floor location), and track the positions of the headset and controllers with respect to a 3D map that is generated and constantly updated by Insight.&nbsp;<\/p>\n\n\n\n<p>How it works: <a href=\"https:\/\/ai.facebook.com\/blog\/powered-by-ai-oculus-insight\/\">https:\/\/ai.facebook.com\/blog\/powered-by-ai-oculus-insight\/<\/a><br>Demo &#8211; Quest: <\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Oculus Insight VR Positional Tracking System (Sep 2018)\" width=\"770\" height=\"433\" src=\"https:\/\/www.youtube.com\/embed\/2jY3B_F3GZk?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>3. <strong>Real-Time AR Self-Expression with Machine Learning<\/strong><\/p>\n\n\n\n<p>One of the key challenges in making self-expression AR features possible is proper anchoring of the virtual content to the real world, a process that requires a unique set of perception technologies able to track the highly dynamic surface geometry across every smile, frown or smirk.<\/p>\n\n\n\n<p>To make all this possible, ARCore employs machine learning (ML) to infer approximate 3D surface geometry for visual effects, requiring only a single camera input without the need for a dedicated depth sensor. 
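<\/p>\n\n\n\n<p>As an illustration of what an effect can do once per-frame 3D face geometry has been inferred, the toy Python sketch below derives an anchor frame (an origin plus a facing normal) from three mesh points. The landmark choices and function names are hypothetical, not the ARCore API.<\/p>

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

def anchor_from_landmarks(left_eye, right_eye, nose_tip):
    """Build a toy anchor frame from three inferred mesh points: origin at
    the nose tip, with a unit normal from the cross product of the
    eye-to-eye and eye-to-nose vectors (points out of the face plane)."""
    across = sub(right_eye, left_eye)   # axis across the face
    down = sub(nose_tip, left_eye)      # roughly down the face
    return {"origin": nose_tip, "normal": normalize(cross(across, down))}

# Hypothetical landmark positions in meters, in a camera-centered frame.
anchor = anchor_from_landmarks((-0.03, 0.02, 0.0), (0.03, 0.02, 0.0), (0.0, -0.02, 0.01))
print(anchor["origin"], anchor["normal"])
```

<p>Because the mesh is re-inferred every frame, the anchor follows the face through smiles and frowns, which is what keeps virtual glasses or masks glued to the right spot.<\/p>\n\n\n\n<p>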
This approach enables AR effects at real-time speeds, using TensorFlow Lite for mobile CPU inference or its new mobile GPU functionality where available. The same technology powers YouTube Stories&#8217; creator effects, and it is also available to the broader developer community via the latest ARCore SDK release and the ML Kit Face Contour Detection API.<\/p>\n\n\n\n<p>How it works: <a href=\"https:\/\/ai.googleblog.com\/2019\/03\/real-time-ar-self-expression-with.html\">https:\/\/ai.googleblog.com\/2019\/03\/real-time-ar-self-expression-with.html<\/a>  <br> Demo &#8211; ARCore:&nbsp; <\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/3.bp.blogspot.com\/-Cru7Rd3_BHc\/XIGZ_jmIraI\/AAAAAAAAD3M\/c77DxgBy1igVFdbbD-p74OJWnSaMhxHRACLcBGAs\/s1600\/image2.gif\" alt=\"\"\/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>4. <strong>Blending Realities with the ARCore Depth API<\/strong><\/p>\n\n\n\n<p>The ARCore Depth API allows developers to use its depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as the phone moves, estimating the distance to every pixel.<\/p>\n\n\n\n<p>One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real-world objects. 
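<\/p>\n\n\n\n<p>The core of occlusion is a per-pixel depth comparison. Below is a minimal, illustrative Python sketch of the idea, assuming a depth map already estimated by the system; the function name and the toy 2x2 map are hypothetical, not the ARCore Depth API itself.<\/p>

```python
def visible(depth_map, x, y, virtual_depth_m, eps=0.01):
    """Per-pixel occlusion test: draw the virtual pixel only if it is nearer
    to the camera than the real surface the depth map reports at (x, y).
    `depth_map[y][x]` holds estimated real-world depth in meters."""
    return virtual_depth_m < depth_map[y][x] + eps

# Toy 2x2 depth map: a wall at 2.0 m on the left, a chair at 1.0 m on the right.
depth = [[2.0, 1.0],
         [2.0, 1.0]]

# A virtual object placed 1.5 m away shows up in front of the wall...
print(visible(depth, 0, 0, 1.5))  # -> True
# ...but is hidden behind the nearer chair.
print(visible(depth, 1, 0, 1.5))  # -> False
```

<p>A renderer runs this comparison for every pixel a virtual object covers, which is what lets a virtual cat disappear behind a real sofa.<\/p>\n\n\n\n<p>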
Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene.<\/p>\n\n\n\n<p>How it works: <a href=\"https:\/\/developers.googleblog.com\/2019\/12\/blending-realities-with-arcore-depth-api.html\">https:\/\/developers.googleblog.com\/2019\/12\/blending-realities-with-arcore-depth-api.html<\/a><br>Demo &#8211; ARCore:   <\/p>\n\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Blending realities with the ARCore Depth API - Deep Dive\" width=\"770\" height=\"433\" src=\"https:\/\/www.youtube.com\/embed\/VOVhCTb-1io?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Solutions that allow humans to explore fully immersive computer-generated worlds (VR) and overlay computer graphics onto our view of our immediate environment (AR) are increasingly being adopted in both education and enterprise. 
With rapid advances in deep learning (AI), access to large volumes of data, and major innovations in AR\/VR devices and software, [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":348,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[30,37,7,39],"tags":[58,60,61,59,20],"class_list":["post-302","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-augmented-reality","category-virtual-reality","category-xr","tag-ai","tag-education","tag-enterprise","tag-solutions","tag-use-cases","grid-item","grid-item-landscape"],"_links":{"self":[{"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/posts\/302","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/comments?post=302"}],"version-history":[{"count":7,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/posts\/302\/revisions"}],"predecessor-version":[{"id":349,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/posts\/302\/revisions\/349"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/media\/348"}],"wp:attachment":[{"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/media?parent=302"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/categories?post=302"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.icomputinglabs.in\/blog\/wp-json\/wp\/v2\/tags?post=302"}],"curies":[{"name":"wp","href":"https:\/\/
api.w.org\/{rel}","templated":true}]}}