Apple is working on artificial intelligence tools, including its own large language model framework called “Ajax” and a chatbot service called “Apple GPT.”
The project has become a major effort within the company as it tries to catch up with competitors like OpenAI and Google. Several teams are collaborating to develop the technology and to address the privacy concerns it raises, Bloomberg reports.
Although Apple has been integrating AI into its products for years, the company is now aiming to make a major AI-related announcement next year. Currently, there is no clear strategy for releasing AI technology directly to consumers, with executives taking different approaches on how to move forward, according to Bloomberg.
Siri’s stagnation may be ending, thanks to Apple’s advances in AI
While AI capabilities have improved across other Apple products, the Siri voice assistant has stagnated. Apple executives are considering integrating AI tools into Siri to improve its functionality and keep pace with the rest of the industry.
Apple’s AI development has been plagued by problems and delays, according to an April report from The Information. The tech giant has lost at least three AI engineers to Google, and has struggled with organization and ambition in artificial intelligence.
Issues with Siri’s development were also highlighted, with answers still being evaluated and edited by humans for security reasons. Apple’s strong stance on privacy has supposedly slowed Siri’s progress in recent years.
In May, the Wall Street Journal reported that Apple had banned the use of ChatGPT and Microsoft’s Github Copilot for fear that confidential information could be leaked through these tools. The WSJ also reported on Apple’s plans to develop its own language model or generative AI system.
At its developer conference in June, Apple made its interest in AI clear, though it avoided the term itself, speaking only of "machine learning." It did so frequently.
In a post-show interview, Apple CEO Tim Cook praised the potential of large language models like ChatGPT, but emphasized the need for responsible development and use because of risks like bias and misinformation.