Updated: Feb 16, 2022
The smartphone has been with us for almost 15 years. It has been an incredibly successful paradigm for communications and mobile computing, spawning an expansive support industry of app developers, and it was arguably the driving force behind the rise of cloud computing.
Every dog has its day, and all technologies face disruption from newer technologies. In 2022 we’ll be hearing a great deal about augmented and extended reality glasses as the up-and-coming replacement for our smartphones, and a segue into the metaverse.
Augmented reality (AR) is an interface in which media, information, and controls are prismatically projected into our field of view from a set of AR-equipped eyeglasses. We interact through voice, eye movement, eyelid movement (blink sequences), and gestures (like tapping our fingers on a 3D-projected keyboard).
Believe it or not, Vuzix and other early pioneers in augmented reality glasses launched products several years before the first iPhone. While conceptually compelling, every aspect of the technology was too immature and rudimentary to be practical. In 2013 Google made its first generation of Google Glass available to a limited set of developers - you remember those lucky few “Glassholes.” That being said, Google was on to something. Despite their limitations, the glasses were stylish in their futuristic way, and more importantly, they had the basic functions we would expect in all future AR glasses:
- Prism-based image display overlaying the direct line of sight (Google Glass used a limited prism covering a single eye)
  - Advanced AR glasses will have a display for each eye, extending across the entire field of view
  - Advanced AR glasses will have two cameras for stereoscopic vision to measure the depth of objects
- Bone-conduction microphone and speaker for voice recognition and commands
- A touch bar for input (Google Glass)
  - Advanced AR glasses eliminate this “touch” feature
Advanced AR glasses add the following:
- Hand and finger gesture tracking (e.g., for typing on a virtual keyboard projection)
- Inward-pointing cameras to track the user’s pupils and gaze vector - an incredibly important feature for an immersive AR experience: what is the user looking at?
For the last decade, Microsoft, with its HoloLens program, and Magic Leap have been developing the components and algorithms to realize the vision of a fully immersive augmented reality interface. So far, however, the form factor and price point are non-starters for consumer applications. In the background, working in “skunkworks” mode, the AR glasses projects of the world’s largest cell phone makers - Apple, Samsung, and Huawei - are coming to maturity. Have they solved the size, weight, and battery life issues inherent in the Microsoft and Magic Leap headsets?
I predict that we’ll see prototypes this year and working products from 2024 to 2025.
I also predict three unintended consequences of AR adoption supplanting the smartphone. First, the “always-on capture” of the environment inherent in AR will accelerate the budding dystopian surveillance state that smartphone cameras and 4G/5G connectivity incubated. “I have seen Big Brother, and he is us!” My second prediction: AR image processing in the cloud will dominate 5G and follow-on 6G cellular data traffic - AR will be a boon to cellular operators. The third and final unintended consequence will be the rise of advertisement-free, curated information and image classification paid for by end-user subscriptions - the death of the free, ad-driven internet. (Hint: this is a great time to start a company providing curated AR data - the danger is that Apple, Microsoft, and others may beat you to it.) Why? Can you imagine endless streams of ads popping into your field of vision all the time? You will ask for the information you want, when you want it, and how you want it displayed. Fortunately, self-driving cars are coming, so pop-up ads in AR won’t cause accidents!
As a final note, what about “bionic” AR contact lenses as opposed to glasses? Research is steadily working toward a practical prototype. Power will be provided wirelessly, as will processed images for display. Flexible OLED display material can bend to fit the convex shape of the lens but currently lacks the pixel density to match the image quality of AR glasses. An alternative CMOS-based imaging display technology is also under investigation (e.g., Mojo Vision). A 9-axis motion sensor will provide viewpoint tracking; however, the motion-tracking accuracy of current technology needs further refinement. Outward-facing camera integration also still needs to be addressed without impacting the field of vision. Finally, Bluetooth earbuds, as a separate accessory, will provide audio communications. We expect commercial AR contacts to follow AR glasses by three to five years, and they may become more popular, as contacts will be less obtrusive than glasses. Time will tell.