
Meta To Add AI To Ray-Ban Meta Smart Glasses In April

According to a report from The New York Times, Meta will officially integrate artificial intelligence into its Ray-Ban Meta smart glasses next month. Multimodal AI functions, such as simultaneous translation and the identification of objects, animals, and monuments, have been in early access since December of last year.

Users can activate the glasses’ smart assistant by saying “Hey Meta” and then composing a message or asking a question. The assistant responds through speakers built into the frames. In its report, The New York Times describes how well Meta’s AI worked when the glasses were taken on trips to the grocery store, to museums, to the zoo, and even while driving.

While Meta’s AI could correctly identify pets and works of art, it wasn’t always accurate. The New York Times found that the glasses had difficulty identifying zoo animals that were far away or behind enclosures. The assistant also failed to correctly identify an exotic fruit called cherimoya after several attempts. As for AI translation, the NYT found that the glasses support English, Spanish, Italian, French, and German.

Meta will likely continue to improve its smart glasses over time. For now, the AI features in the Ray-Ban Meta smart glasses are only available to US users through an early access waitlist.
