By Mike Elgan, Contributing Columnist

2024 will be ‘The year of AI glasses’

Opinion | Jan 05, 2024 | 6 mins
Artificial Intelligence | Augmented Reality | Virtual Reality

Hardly anybody talks about AI glasses, but by the end of the year, many users will be wearing them every day.

[Image: Ray-Ban Meta glasses. Credit: Meta]

Apple’s pricey Vision Pro augmented reality platform is expected to arrive in the first quarter of 2024. But by the end of the year, I predict the platform of the year will be — drum roll, please — AI glasses!

Wait, what?

That’s right. Glasses that let you interact with artificial intelligence (AI) from the comfort of your own face will be the sleeper hit of the year. In fact, the buzz around market leader Meta has already begun.

Announced in September and shipped in October, the Ray-Ban Meta glasses were initially received with a collective shrug. They were assumed to be camera glasses, like Snap Spectacles, or virtual assistant glasses, like Amazon’s Echo Frames. Or, for that matter, a small upgrade from their Meta predecessor, Ray-Ban Stories. But Spectacles, Echo Frames and Ray-Ban Stories all failed to thrill the gadget-loving public.

It took a while for everyone to learn that Ray-Ban Meta glasses, which start at $299, are orders of magnitude better and more powerful than any of these lackluster gadgets: they offer a much better camera, super high-quality audio, the ability to live-stream to social networks, and an incredibly good AI assistant.

(Here’s a look at the camera quality via my own photos and a video.)

Despite the lackluster launch, Ray-Ban Meta glasses started blowing up online in December when three things happened.

First, Meta announced a kind of closed beta of its “multimodal” feature. While users could already conjure up the Meta Assistant at any time using the “Hey Meta” command, the “multimodal” feature adds a “look” command that sends a picture taken through the glasses’ camera to the Meta Assistant for processing and analysis. You can tell your glasses to look at a table full of ingredients and condiments and give you a recipe using those items, for example — all hands-free. The combination of spoken and visual interaction with Meta’s powerful AI is mind-blowing and conspicuously world-changing.
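To make that interaction pattern concrete, here’s a minimal sketch of what a “look” command pipeline might look like. Meta hasn’t published an API for this feature, so the Camera and VisionAssistant classes below are hypothetical stand-ins; the point is only the flow: capture a frame through the glasses, pair it with the spoken request, and hand both to a vision-capable model.

```python
# Illustrative sketch only: Camera and VisionAssistant are hypothetical
# stand-ins, not Meta's actual (unpublished) API.
from dataclasses import dataclass


@dataclass
class Frame:
    jpeg_bytes: bytes  # one photo captured through the glasses' camera


class Camera:
    """Stand-in for the glasses' camera."""

    def capture(self) -> Frame:
        # A real device would return an actual photo here.
        return Frame(jpeg_bytes=b"\xff\xd8...")


class VisionAssistant:
    """Stand-in for a multimodal LLM endpoint (image + text in, text out)."""

    def ask(self, frame: Frame, prompt: str) -> str:
        # A real implementation would send the image and prompt to the model.
        return f"(model answer to {prompt!r}, given a {len(frame.jpeg_bytes)}-byte image)"


def handle_look_command(camera: Camera, assistant: VisionAssistant, spoken_prompt: str) -> str:
    """Rough flow of a hands-free 'Hey Meta, look ...' request."""
    frame = camera.capture()                    # 1. take a photo, no hands needed
    return assistant.ask(frame, spoken_prompt)  # 2. multimodal request: image + text


if __name__ == "__main__":
    print(handle_look_command(
        Camera(),
        VisionAssistant(),
        "Look at these ingredients and give me a recipe.",
    ))
```

The design point of such a flow is that the glasses themselves only capture and forward; the heavy lifting presumably happens in the model on the other end, which helps keep the form factor light.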

Second, tech journalists started forming a consensus that Ray-Ban Meta glasses are actually transformative. Though I was praising them way back in October, a critical mass of my colleagues really started getting excited about them only last month.

Mashable’s Kimberly Gedeon said: “The Ray-Ban Meta Smart Glasses shocked me, in a good way.”

9to5Mac’s Filipe Espósito wrote, “Ray-Ban Meta glasses convinced me to believe in smart glasses.”

CNET’s Scott Stein said (about the “multimodal” feature) that “the demo wowed me because I had never seen anything like it.”

Third, Ray-Ban Meta videos started taking off on TikTok and other social networks (even though the glasses’ live-streaming feature supported only Facebook and Instagram).

Yes, it’s already a market

Several AI glasses products are already on the market. For example, Lucyd Lyte Smart Eyewear Powered with ChatGPT was announced in August. The glasses are inexpensive, but of fatally low quality — especially the sound quality — according to reviewers.

Another product, the Solos AirGo 3, also provides access to ChatGPT from the glasses and has tended to get more positive reviews than the Lucyd Lyte products.

And, of course, Amazon recently released a third-generation update to its Echo Frames, which have no camera but are nevertheless only $30 less expensive than Ray-Ban Meta glasses. Plus, they let you access Amazon’s Alexa assistant, which isn’t really LLM-based generative AI.

Other AI wearables have hit the market, and more are coming. Already, it’s pretty easy to chat with AI via earbuds. The problem with earbuds as a platform is that nobody wears them all day, every day — unlike glasses, which many people do, in fact, wear all day except when they’re sleeping.

Smartwatches are another avenue for accessing AI. It’s possible to use an app to talk to AI through your watch and get an answer. The weak element here is that audio from a watch speaker is generally too quiet for the user and too loud for others nearby.

Humane rolled out its AI Pin, a 55-gram magnetic device that attaches to clothing. It uses a camera and microphone for input and a speaker and laser projection for output. The boldness of a wholly new wearable platform is refreshing. But asking the public to pin something to their clothes — not possible or advisable in many circumstances — is too big an ask. The Humane AI Pin doesn’t stand a chance of succeeding in the market.

The importance of the form factor

In fact, the reason all these wearables will fail to catch on as AI interface devices is that the glasses form factor is obviously superior. The temples, or arms, of glasses are perfectly positioned to deliver audio that sounds great to the wearer and is close to silent to others nearby. And they’re large enough to hold batteries, antennas and other electronic components.

As Ray-Ban Meta glasses prove, glasses can look stylish while still holding a high-quality camera and a light to show others when you’re capturing pictures or videos.

Of course, many people already wear glasses every day. Ray-Ban lets you order the Meta glasses with transition lenses (sunglasses in the sun, clear indoors), prescription lenses, or both. I know people who started wearing clear glasses indoors just because of the Ray-Ban Meta glasses. And for people who already wear glasses, opting for AI frames is a no-brainer.

Ray-Ban Meta glasses rule the AI glasses market simply because they’re of vastly higher quality than existing competitors and come at a price that Meta must be subsidizing. In fact, dozens of Ray-Ban glasses without any electronics whatsoever are more expensive than Ray-Ban Meta glasses.

But just as big-name competitors quickly emerged after Amazon launched the Echo, we can expect rivals like Google, Microsoft, OpenAI and others — including possibly even Apple — to enter the AI glasses market in the next year or two.

This is the market I’m looking forward to. Meta proved that if you just make the whole experience seamless and super high-quality, AI glasses are irresistible.

Google has no option but to ship (probably Pixel-branded) smart glasses, because AI glasses will be what replaces Search for most people most of the time.

Amazon will almost certainly ship AI glasses (probably Fire-branded) after being clobbered in the market by Meta with a product that puts Echo Frames to shame. Amazon wants to collect all the data of all the people, and Amazon Fire Glasses (modeled after Ray-Ban Meta glasses) fit the bill perfectly.

And Apple will feel compelled to jump into AI glasses sooner or later with a platform of its own (probably branded iGlasses). Glasses, like smart watches, combine both tech and fashion and demand high usability. This will be an irresistible market for Apple.

My dark horse prediction is that AI glasses become OpenAI’s first hardware play. Regardless, the year 2024 is going to surprise everyone, because AI glasses will emerge as the most important gadget category in tech.