Apple Developing AI That Understands Humans Better Than GPT-4
Apple researchers are developing an AI system that can resolve ambiguous references to objects on the screen, as well as conversational and background context. "The ability to understand context, including references, is essential for a voice assistant," the authors write.
Apple describes the system, called ReALM (Reference Resolution As Language Modeling), as a way to recast reference resolution as a task for large language models: it helps the AI interpret a user's descriptions of objects on the screen and better follow the context of a conversation. The result, the researchers argue, is more intuitive and natural interaction with devices.

Reference resolution lets systems better understand natural speech by allowing users to rely on pronouns and other indirect references when talking to an AI. This aspect of human conversation has traditionally been a major challenge for voice assistants; ReALM reduces it to a problem solved entirely at the language-model level. The model can then correctly interpret references to visual elements on the screen and integrate them into the flow of the conversation.
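The core idea is to serialize what is on the screen into plain text so that a text-only language model can resolve references without any visual input. The sketch below illustrates that pipeline in Python; the entity fields, the prompt layout, and the `toy_llm` stub are illustrative assumptions for this article, not Apple's actual implementation or the paper's exact encoding.

```python
import re
from dataclasses import dataclass

@dataclass
class ScreenEntity:
    entity_id: int
    kind: str    # e.g. "phone_number", "address", "link"
    text: str    # visible text of the element
    top: int     # coarse position, used to preserve reading order
    left: int

def serialize_screen(entities: list[ScreenEntity]) -> str:
    """Flatten on-screen entities into numbered lines, sorted
    top-to-bottom then left-to-right, so the model sees the screen
    as ordinary text."""
    ordered = sorted(entities, key=lambda e: (e.top, e.left))
    return "\n".join(f"[{e.entity_id}] {e.kind}: {e.text}" for e in ordered)

def build_prompt(entities: list[ScreenEntity],
                 dialogue: str, request: str) -> str:
    return (
        "Screen contents:\n" + serialize_screen(entities) + "\n\n"
        "Conversation so far:\n" + dialogue + "\n\n"
        "User request: " + request + "\n"
        "Reply with the id of the entity the user refers to."
    )

def toy_llm(prompt: str) -> int:
    """Stand-in for a real model: picks the entity whose serialized
    line shares the most word tokens with the user's request."""
    lines = prompt.splitlines()
    request = next(l for l in lines if l.startswith("User request:"))
    req_tokens = set(re.findall(r"[a-z0-9]+", request.lower()))
    best_id, best_score = -1, -1
    for line in lines:
        if not line.startswith("["):  # only entity lines look like "[id] ..."
            continue
        tokens = set(re.findall(r"[a-z0-9]+", line.lower()))
        score = len(tokens & req_tokens)
        if score > best_score:
            best_id, best_score = int(line[1:line.index("]")]), score
    return best_id

if __name__ == "__main__":
    screen = [
        ScreenEntity(1, "business_name", "City Pharmacy", top=10, left=0),
        ScreenEntity(2, "phone_number", "555-0134", top=30, left=0),
        ScreenEntity(3, "address", "12 Main St", top=50, left=0),
    ]
    prompt = build_prompt(screen, "User: find a pharmacy nearby",
                          "call the phone number on the screen")
    print("Resolved entity id:", toy_llm(prompt))  # prints 2
```

In a real system the `toy_llm` stub would be replaced by an actual language model; the serialization step is what turns a screen-understanding task into pure text, which is the reduction the paper's title describes.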
Apple CEO Tim Cook has promised a major AI announcement this year, and this time he may genuinely surprise everyone: Apple is reportedly discussing a partnership with Google, and new iPhones may get some Gemini-based features. The market reacted positively to the tie-up between competitors: shares of Alphabet and Apple have already risen by 7% and 2%, respectively.