Apple is in talks with OpenAI and Anthropic about integrating their language models into Siri, according to reporting by Bloomberg. The move responds to criticism of the voice assistant's limited capabilities: Siri currently lags behind competitors in recognizing context and handling complex conversations.
According to Bloomberg, the company is considering adding third-party AI to Siri while it continues to improve its own language models. Apple is currently testing models from both OpenAI and Anthropic internally, including their compatibility with its Private Cloud Compute servers. Preliminary results suggest that Anthropic's models are better suited to Siri's needs, although negotiations are still ongoing.
Analysts say Apple is being extremely cautious, particularly given Anthropic's financial demands: the company is reportedly seeking a multibillion-dollar contract. At the same time, Apple is not ruling out other options as it looks for the best way to deliver the new features.
Although Apple has already announced Apple Intelligence as the foundation for Siri's new capabilities, the features' release was postponed to 2026. Following negative user feedback, plans changed, and the updated Siri is now expected to arrive with iOS 26. A partnership with OpenAI or Anthropic would let the company ship the promised features sooner without abandoning development of its own models.