AR – Augmented and Mixed Reality
Augmented Reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated imagery. Thanks to Pokémon GO, the whole world is now fascinated by AR, and Apple has leapfrogged everyone to become the largest AR platform in the world.
So what can you do with it? Below are examples of how different AR platforms can be used.
Example of ARKit for American Airlines
The American Airlines use case leverages SLAM (Simultaneous Localization and Mapping) technology through Apple's ARKit and Google's ARCore. We also integrated several location-based data points served up by American Airlines' services, along with IoT data points available at DFW Airport. Any wayfinding AR experience needs these data points to display the right information on screen.
AR also leverages different underlying technologies to create these experiences. Below is an overview of the three principal methods; an AR experience can use one or several of them.
Groove Jones is an approved developer for all of the major AR platforms – Apple ARKit, Microsoft HoloLens, and MetaVision Meta 2. So what are the fundamental differences? Let's dive into the different platforms.
Apple ARKit – Mobile AR iDevices
In September of 2017, Apple will launch its AR capabilities with the new iOS 11 release, making Apple the largest AR hardware platform in the world. With this operating system update, over 500,000 iDevices (iPhones and iPads) are expected to be AR-enabled.
Industry analysts predict that by 2021 Apple will have over 3 billion AR-enabled devices in the market.
The software uses the device's camera system to detect flat surfaces to which holograms can be anchored. With the graphics and processing power of the iDevices, Apple has dramatically leapfrogged every other device. To see the content, you will need an iPhone or iPad running iOS 11. Below is an example.
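To give a sense of how this works under the hood, here is a minimal Swift sketch of ARKit's plane detection and anchoring, assuming a standard iOS 11 project with an ARSCNView in the storyboard. The class and outlet names are hypothetical, not part of any shipped Groove Jones project.

```swift
import ARKit
import SceneKit

class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!  // hypothetical outlet wired in the storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to look for horizontal surfaces (floors, tables).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new surface and adds an anchor for it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Anchor a simple hologram (a small box) to the detected plane.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(planeAnchor.center.x, 0.05, planeAnchor.center.z)
        node.addChildNode(boxNode)
    }
}
```

Because the anchor is tied to a tracked real-world surface, the box stays put as the user walks around it, which is the same mechanism a wayfinding experience uses to pin directions to the airport floor.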
AR Headset Hardware
Groove Jones is a licensed developer for numerous AR headsets, including the Microsoft HoloLens and the MetaVision Meta 2. Both headsets offer similar functionality; however, the Microsoft HoloLens processes everything on the unit itself, whereas the MetaVision Meta 2 is tethered to a PC, which provides higher processing power.
Microsoft – HoloLens
The HoloLens has a field of view of about 30° and a 1268 x 720 display. It is controlled via hand gestures and handles all processing on the headset itself.
MetaVision – Meta 2
The Meta 2 has a field of view of 90° and a 2560 x 1440 display, twice the resolution of the HoloLens. It is controlled via hand gestures and uses a tethered PC for its processing power.
Gestures are one of the three primary forms of input on HoloLens. Once you've targeted a hologram with your gaze, gestures allow you to interact with it. Gesture input lets you manipulate holograms naturally using your hands or, optionally, with a clicker.
Projects Coming Soon!
This is a rapidly growing space. Check back soon for some examples of case studies that we can publicly show.
If you are interested in learning more about how to utilize AR for your digital and experiential campaigns, feel free to reach out to us.