I tried the new Meta AI app: 3 unexpected features

Meta has spent the better part of a year integrating Meta AI with Facebook, Instagram, WhatsApp, and its other existing services, but hadn't launched a standalone experience for Meta AI fans. That all changed yesterday at LlamaCon, the company's inaugural AI developer conference, when the company finally launched the Meta AI app.

The new app is built with Meta's Llama 4 model. It's a full-fledged competitor to ChatGPT, which became the fastest-growing app in history after its launch.

Already, AI enthusiasts are digging into the app to see what sets it apart from the competition. Per Meta's press release, the big takeaway is personalization. Not only does the app integrate with your Facebook and Instagram accounts to give you more personalized responses, but it also has a memory feature, so it can reference past discussions and add more context to future ones.

This isn't necessarily unique to the Meta AI app, since Grok has a memory feature too. However, we would argue the ability to work with both Facebook and Instagram is fairly significant, considering their widespread popularity. That also gives Meta AI more potential data to draw from to make its answers more personalized.

With that said, there's more to the Meta AI app than just its personalization and memory capability, and some of those features are fairly unique to the Meta AI experience. So, after downloading and experimenting with the new Meta AI app, here are three big features to check out.

A social Discover feed to give you ideas


Credit: Screenshot Courtesy of Meta

Let's start with the most obvious one: the Discover feed. Upon opening the app, you can reach it by tapping the compass icon. It works almost exactly as you'd expect. People use Meta AI to generate answers to questions, images, and other such things, and those posts are then shared to the feed for you to engage with.


You can like, comment on, or share anything you see there. A fourth button loads the same prompt into your own Meta AI conversation, so you can see what you get when you ask Meta AI the same question. During my testing, I saw someone post an image with the prompt "Imagine me Miley Cyrus at Beyoncé's Cowboy Carter Tour." The image it generated for me was different from the one in my Discover feed. Now, whether Meta's AI is supposed to be generating quasi-photorealistic images of public figures is another question entirely.

Near as I can tell, the Discover feed has two important uses. The first is showing off what Meta AI can do while giving you yet another thing to doomscroll. The other is giving users fresh ideas on what they can ask Meta AI about. During my brief time on Discover, I found people asking about Mars colonization, what colors would work for their wardrobe, and loads of stuff about the Catholic Church in the wake of Pope Francis' passing. In short, it serves not only as entertainment, but also as an idea generator, especially when the next wave of AI trends hits the market.

Hardware support

The Meta AI app also supports the Ray-Ban Meta smart glasses. In fact, Meta is replacing the existing Meta View companion app with the Meta AI app, so this is the app you'll need to use for your smart glasses moving forward. It's easy enough to set up. Just open the app, tap the glasses icon to add your smart glasses, and then continue using them as normal from there.

Per Meta, once you get everything synced up, you'll be able to start a conversation on the glasses and continue it in the app. Chat history will also be accessible through the Meta AI app, and it'll all be integrated naturally with conversations you have in the app. Meta does note that you won't be able to start a chat in the app and then continue it on the glasses. Even so, OpenAI, xAI, and Google certainly don't offer this type of hardware integration.

I don't personally own a pair of the smart glasses, so there are likely some extra little things that I haven't seen that Meta didn't put in the press release. Even so, direct hardware support is something ChatGPT doesn't have.

Screenshot from the Meta AI app


Credit: Screenshot Courtesy of Meta

Full-duplex voice mode

This one isn't particularly new or unique, but it's the first such implementation for Meta's AI. For the uninitiated, full-duplex voice mode describes a feature where you can chat with the AI in real time. You talk, it responds, you respond back, and so on. A lot of AI chatbots have this feature already.

Meta uses it differently, though. While you can still chat with Meta's AI in both directions, full-duplex voice mode also changes how the AI talks back. It integrates natural human speech patterns, like pauses, along with filler words like "umm." This is demonstrably different from how the AI typically talks to you, so it's something different for people who want that.

The app says that the feature is in beta and doesn't use the most updated knowledge base like the regular AI voice, so you'll likely get worse answers if you use it. Once it hits primetime, though, it'll be a neat little addition. In the meantime, the regular AI voice has options for John Cena, Awkwafina, Judi Dench, Keegan-Michael Key, and Kristen Bell.

Screenshot from the Meta AI app


Credit: Screenshot Courtesy of Meta
