Meta Outlines AI, VR and AR Advancements at Connect 2024

It’s Meta Connect day, and Meta has shared a heap of updates on its latest technological advancements, including new Quest VR headsets, AR glasses, an updated Llama AI model, new generative AI options across its apps, and more.

There’s a heap to get through, and a heap to consider. Here’s an overview of all the big announcements from Meta’s latest showcase.

Orion AR Glasses

The star of the show was Meta’s first AR glasses, called “Orion”, which it’s been developing over the past five years.

Meta says that this is the “most advanced pair of AR glasses ever made”, and the result of complex efforts to compress the required components into a small enough format that they can viably be worn as regular glasses.

As they currently stand, the Orion glasses come with a wristband controller, “for subtle, socially acceptable input”, as well as a wireless compute puck.

Meta says that Orion has “the largest field of view in the smallest AR glasses form to date”, and will enable a range of immersive experiences, from “multitasking windows and big-screen entertainment to life-size holograms of people”.

The glasses themselves are chunkier than regular sunglasses, and far more so than Meta’s own Ray-Ban smart glasses. But again, Meta’s had to squeeze a lot of tech into a very small space.

So you sort of look like an eccentric fashion designer when wearing them, though you likely won’t be wearing these ones just yet.

Meta’s not releasing its AR glasses to the public at this stage, but it is giving them to selected developers and internal Meta staff for testing.

So these examples are more to show just how far the technology has developed, but they may not be exactly what Meta releases as AR glasses in the next few years.

That could mean that the eventual consumer version looks less bulky, and more stylish, via Meta’s partnership with Ray-Ban maker EssilorLuxottica.

But essentially, the main crux of Meta’s showcase today was to underline that neither Apple, with its Vision Pro, nor Snap, with its latest AR glasses, is beating it on this front.

Meta’s AR glasses will be more capable than Snap’s, and more viable (and likely cheaper) than Apple’s version.

We’re not there yet, but Meta’s AR expansion is getting close.

New Functionality for Ray-Ban Meta Glasses

Speaking of Meta’s Ray-Ban smart glasses, they’re also getting upgrades, including improved voice commands, so you can hold a conversation with Meta AI without having to continually say “Hey Meta”, and the capacity to record and send voice messages on the go.

Also:

“We’re adding the ability for your glasses to help you remember things. Next time you fly somewhere, you don’t have to sweat forgetting where you parked at the airport – your glasses can remember your spot in long-term parking for you.”

That could be particularly handy, while Meta’s also adding a new translation feature, which will listen to the language being spoken around you and translate it into English via the glasses’ open-ear speakers.

Now you’ll be able to confirm whether people are talking trash about you in another language, or if you’re just paranoid.

Quest 3S

Meta’s also announced its latest Quest VR headset, with the 3S model providing the same capabilities as the Quest 3, but at a lower price point.

“Starting at just $299.99 USD, Quest 3S is the best headset for those new to mixed reality and immersive experiences, or who might have been waiting for a low-cost upgrade from Quest and Quest 2.”

Meta says that it’s rebuilt its Horizon OS, so it now offers better support for key 2D apps “like YouTube, Facebook and Instagram”. It’s also improved the headset’s spatial audio and Passthrough elements.

Essentially, it’s a cheaper way to get the capabilities of Meta’s best VR headset, which is key to maximizing adoption.

Indeed, Meta’s also dropping the price of the 512GB Meta Quest 3 unit by $150 (to $499.99 USD), which will ideally see more people taking up its VR hardware, and expanding its user community.

The more adoption VR sees, the more momentum it gets, and while it’s still a ways from being a must-have technology, advanced systems and new experiences are helping to build the foundations of Meta’s VR and metaverse vision.

Celebrity Voices for Meta AI

Okay, I don’t know why Meta thinks that this is a key pathway to broader AI adoption, but for some reason, Meta’s also added celebrity voices to its AI chatbot.

So the main addition is that you can now use your voice to talk to Meta AI on Messenger, Facebook, WhatsApp, and Instagram DMs, and it’ll respond to you out loud.

But you can also now choose a celebrity voice for your Meta AI chatbot, including AI variations of stars like Awkwafina, Dame Judi Dench, John Cena, Keegan-Michael Key, and Kristen Bell.

So when you get an answer from the system, it’ll sound like a celebrity. Cool right?

No doubt this will be an interesting novelty, but I just don’t see why Meta thinks this is a valuable addition.

I mean, it’s already tried this with celebrity-styled AI chatbots, which it eventually shut down because no one cared, while it’s also giving influencers the opportunity to create AI bots in their likeness that respond to fans on their behalf.

I don’t see why that’s engaging, because you’re not actually communicating with these celebrities and creators, just AI variations of them. Is that what people want? Evidently, it isn’t, but Meta’s pushing ahead either way.

What’s more, Meta reportedly paid millions of dollars for the rights to use these celebrity voices.  

Yeah, I don’t think that this will be a big thing, but maybe some super fans of Dame Judi Dench will get a kick out of having a robot version of her answering their Meta AI queries.

Meta AI Image Context

Meta’s also building on its AI utility, with Meta AI now able to provide answers based on visual cues.

Meta’s AI chatbot is now better able to understand the visual elements within an image, and can provide answers based on those elements.

The system can also now edit existing images, so you can ask it to, say, add things into a picture.

You can also ask it to remove or change certain elements in an image, and it’ll be better able to facilitate such requests.

“And if you want to reshare a photo from feed to your Instagram Story, Meta AI’s new backgrounds feature can take a look at your photo, understand what’s in the image and generate a fun background for the story.”

AI Translations

Meta’s also rolling out AI translations for Reels, so creators can reach a broader audience with their content.

The audio translations will simulate the speaker’s voice in the other language, and sync their lips to match, which should make for a more authentic translation experience.

Meta says that it’s testing this with selected creators on Instagram and Facebook to begin with.

More AI

Meta’s also expanding its “Imagine” AI feature, providing more ways for people to create fantastical AI depictions of themselves within its post composer options.

It’s also adding AI-generated chat themes in Messenger and IG DMs, along with recommended AI content customized to your interests.

AI for Business

Meta will also enable more businesses to create their own AI chatbots, powered by its advanced AI models, which will be available via click-to-message ads on WhatsApp and Messenger.

“From answering common customer questions to discussing products and finalizing a purchase, these business AIs can help businesses engage with more customers and increase sales.”

That could be an easy way to maximize customer engagement, and provide immediate responses and service 24/7.

Meta also says that more advertisers are adopting its generative AI ad tools, with more than 15 million ads created with them in the last month.

“On average, ad campaigns using Meta’s generative AI ad features resulted in an 11% higher click-through rate and 7.6% higher conversion rate compared to campaigns that didn’t use the features.”

More advertisers have reported success with Meta’s Advantage+ campaigns, and as its AI systems continue to improve, they seem to be driving better results.

A heap of things to consider, and Meta’s also launched a new version of its Llama language model, which will enable expanded development opportunities.

Many things happening at Meta HQ, which will all have varying levels of interest and impact.   
