2020-09-20

You’ll be wearing true AR glasses sooner than you think

Author: Iva Filipović

PCs marked the first great technology wave that changed our lives forever. Smartphones marked the second, and AR/VR looks very promising as a wave of its own. It brings human-centered computing closer, combining the real and the virtual to explore all the new ways human experiences and lives can change for the better.

If you watched the Facebook Connect event, streamed live four days ago, you know there was an overwhelming amount of information presented. The focus was on AR/VR.

We’ll focus on AR and the glasses. Three types of glasses were mentioned: smart glasses, Project Aria glasses, and true AR glasses.

By the end of the event, we learned from Facebook’s CEO Mark Zuckerberg that the smart glasses will be released next year. Michael Abrash, Oculus Chief Scientist and a researcher at Facebook Reality Labs, estimated that the first true AR glasses are still ten years or so away because the technology isn’t ready yet.

However, we have reason to believe we’ll get hold of these true AR glasses a lot earlier. Why? We’ll get to it. But first, let’s explore the possible accessories and features of the glasses mentioned above.

EMG armband as a true AR glasses accessory

EMG (electromyography) is a diagnostic procedure that has been used in AR development for years. Emakinians tested it as early as 2015 by combining an Oculus headset with a Myo armband.

EMG measures the electrical activity of muscles and of the nerve cells that control them, known as motor neurons. It translates these signals into numbers that any piece of technology can read, so the user can easily control and interact with it. The Myo armband and others like it serve as mediators between humans and machines and the software they run on. In this case, the gadget the armband will control in the future is the pair of true AR glasses.
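To make that pipeline concrete, here is a minimal sketch of how wrist EMG could be turned into a discrete command. The window size, feature choice, and gesture labels are illustrative assumptions on our part – a textbook nearest-prototype approach, not the actual Myo or Facebook Reality Labs pipeline.

```python
import numpy as np

# Hypothetical sketch: turning raw EMG samples into a discrete command.
# Gesture labels and classifier are assumptions, not the real product pipeline.
GESTURES = ["rest", "pinch", "fist", "point"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per EMG channel over one time window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(features: np.ndarray, prototypes: np.ndarray) -> str:
    """Nearest-prototype classifier: pick the gesture whose stored
    feature vector is closest to the one we just measured."""
    distances = np.linalg.norm(prototypes - features, axis=1)
    return GESTURES[int(np.argmin(distances))]

# 8 channels (the Myo armband has 8 EMG pods), 200 samples per window
window = np.random.randn(200, 8)                        # stand-in for real sensor data
prototypes = np.abs(np.random.randn(len(GESTURES), 8))  # learned per user in practice
print(classify(rms_features(window), prototypes))
```

In a real system the prototypes would be learned during a short per-user calibration – which is exactly the kind of quick adaptation the demonstration below hints at.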

Facebook Reality Labs testing their armband.
Source: Facebook Connect Event.

What makes an armband superior? Motor neuron signals are a lot easier to read at the wrist than brain signals are at the head, Abrash claims. This should come as no surprise: unlike the brain, which is protected by the skull, the nerves at the wrist sit behind no thick framework of bone.

Putting on an armband is completely non-invasive, no more complicated than putting on a watch or a bracelet – unlike, for example, having a piece of your skull removed to install a Neuralink implant.

Demonstration of a person born with a hand abnormality using a normal, five-fingered virtual hand through sheer force of mind and the help of an armband. It took the person five minutes to gain control of the virtual hand. Source: Facebook Connect Event.

What the armband therefore enables is interacting with the virtual (digital) world as if it were reality – and it goes as far as erasing disabilities in the virtual world.

It enables superpower-like abilities: typing without looking or without a keyboard in front of us, moving objects in the virtual world with simple gestures and without actually touching them, gaining control of a virtual “sixth” finger, and many more.

The human ability to gain control of a “sixth” finger has already been demonstrated in the real world through prosthetics.

The keyboard becomes obsolete when typing.
Source: Facebook Connect Event.

Enabling audio superpowers

True AR glasses will excel at placing the user at the center of computing. They are supposed to adapt to the user’s needs, learn, and make life-improving suggestions from what they learn – the opposite of the all-too-familiar story of the user having to adapt to the technology and learn how to use it.

The glasses will see the world exactly as the wearer sees it, with AI that is completely personalized and contextualized, proactive rather than reactive, reliable, private, and intuitive.

One of the functionalities will center around beamforming, a technique that steers sound toward a desired point in space (another user) regardless of the setting (e.g., a loud coffee shop) or how many objects stand between the two or more users.
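As a rough illustration, here is a delay-and-sum beamformer – the textbook form of the technique – which focuses a microphone array on sound arriving from one direction (the same math, run in reverse, focuses emitted sound toward a point). The array geometry and sample rate below are our assumptions, not Facebook’s design.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 16_000     # Hz

def delay_and_sum(signals: np.ndarray, mic_x: np.ndarray, angle_deg: float) -> np.ndarray:
    """Steer a linear mic array toward `angle_deg` by delaying each
    channel so sound from that direction adds up in phase."""
    direction = np.cos(np.radians(angle_deg))
    delays = mic_x * direction / SPEED_OF_SOUND          # seconds per mic
    shifts = np.round(delays * SAMPLE_RATE).astype(int)  # whole samples
    shifts -= shifts.min()                               # keep shifts >= 0
    aligned = [np.roll(sig, -s) for sig, s in zip(signals, shifts)]
    return np.mean(aligned, axis=0)  # in-phase speech adds up, noise averages out

# 4 mics, 2 cm apart, half a second of stand-in audio
mic_x = np.arange(4) * 0.02
signals = np.random.randn(4, SAMPLE_RATE // 2)
focused = delay_and_sum(signals, mic_x, angle_deg=30.0)
```

The design point is simple: by trading a few samples of delay per microphone, the array behaves like a directional ear that can be re-aimed in software.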

In addition to beamforming, an HRTF (head-related transfer function) will mimic how the real ear receives sound as the head moves. For example, say you’re listening to speakers A and B, who stand at different distances and are talking at the same time: if you turn your head toward A, you’ll hear less of what B is saying, and vice versa.
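A real HRTF convolves each source with filters measured for the human ear; the hedged sketch below uses only a simple direction-dependent gain to illustrate the head-turning effect described above. All names and numbers are illustrative.

```python
import numpy as np

# Hypothetical sketch of head-tracked attenuation: sources off the
# listener's facing direction are damped. This gain model only
# illustrates the idea; it is not a real HRTF implementation.

def directional_gain(source_az_deg: float, head_az_deg: float) -> float:
    """Louder when the head points at the source, quieter off-axis."""
    offset = np.radians(source_az_deg - head_az_deg)
    return 0.5 * (1.0 + np.cos(offset))  # 1.0 on-axis, 0.0 directly behind

def mix(sources: dict[float, np.ndarray], head_az_deg: float) -> np.ndarray:
    """Sum all sources, each scaled by its head-relative gain."""
    return sum(directional_gain(az, head_az_deg) * sig
               for az, sig in sources.items())

# Speaker A at -40 degrees, speaker B at +40 degrees (stand-in audio)
sources = {-40.0: np.random.randn(8000), 40.0: np.random.randn(8000)}
facing_a = mix(sources, head_az_deg=-40.0)  # A dominates the mix
facing_b = mix(sources, head_az_deg=40.0)   # B dominates the mix
```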

It’s well worth calling those audio superpowers, as Abrash himself puts it.

Visual superpowers as the icing on the cake

Visual superpowers will turn true AR glasses into the next big thing in tech. The glasses will enable anytime access to whiteboards, maps, screens, monitors, calendars, notes, outdoor settings, and avatars.

To make this possible, the entire world needs to be scanned completely and in detail. When we think of world-scanning, we usually think of satellites, which lack the necessary detail. Car- and backpack-mounted scanners cover only the outside of the spaces people interact with.

This is where Project Aria glasses come into play. Project Aria introduces a new type of world scanner in the form of glasses that capture the world in detail. Facebook employees have already started wearing them, scanning the world in order to map it perfectly and enable the full set of AR glasses functionalities.

Project Aria glasses.
Source: Facebook Reality Labs.

Here is an example of functionality that requires a completely scanned and mapped world. Say you want to connect with a life-size avatar of your friend, perfectly scaled to the setting you’re in. It will feel more natural no matter the occasion – studying, killing boredom, or just wanting to hang out this way because you’re too far apart or one of you has tested positive for Covid-19.
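To see why the map matters, here is a minimal sketch of the underlying geometry: once the glasses localize themselves in a shared world map, anything pinned to a world coordinate (an avatar, a whiteboard) can be redrawn from the wearer’s current viewpoint. The rotation-plus-translation pose convention below is a standard one we’re assuming for illustration, not Aria’s actual data format.

```python
import numpy as np

def world_to_glasses(point_w: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Express a world-frame point in the glasses' own frame,
    given the glasses' pose (rotation R, position t) in the world."""
    return R.T @ (point_w - t)

# A virtual whiteboard pinned 2 m in front of the map origin
whiteboard_w = np.array([0.0, 0.0, 2.0])

# Wearer stands 1 m back, head turned 10 degrees about the vertical axis
theta = np.radians(10.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0,           1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.0, 0.0, -1.0])

print(world_to_glasses(whiteboard_w, R, t))  # where to render the whiteboard
```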

From having a virtual office on the go to live suggestions or route guidance – the utility possibilities are endless.

An example of an avatar study buddy.
Source: Facebook Connect Event.

Throughout the Facebook Connect event, Michael Abrash stressed that everything is still in research and could take years before it’s available to the wider public. However, the only thing that seems to be missing to complete the true AR glasses dream is the map of a fully scanned world.

The technology and display are there

We speculate that the reason Abrash and Zuckerberg say the display and all the functionalities aren’t ready yet lies in next year’s release: the smart glasses will only have the aforementioned “audio superpowers”. That won’t make them much different from what’s already on the smart-glasses market, but it will make them worth an upgrade. A good example is the Vue glasses, which we’ve also tested at Emakina.

In March 2020, Facebook and British microLED manufacturer Plessey Semiconductors Ltd entered into an exclusive arrangement to further Facebook’s AR plans. The gif below demonstrates what Plessey has been working on. It looks much like what we’ll see in the true AR glasses once they’re released.

The technology is there. The display is there.

The only thing separating us from that future is the time we’ll spend waiting for the Project Aria glasses to finish scanning the world completely.

And we can’t wait for the moment it’s done.
