Meta's flagship large language model (LLM) Behemoth was scheduled to be released in April, but the company has now delayed its release for the second time, pushing it back until the fall. According to Reuters, the delay stems from engineers' concerns about the model's capabilities.
Meta engineers are struggling to make significant improvements to Behemoth, and there are doubts within the company about whether its gains over previous models are large enough to justify launching a new LLM for widespread use.
The Llama 4 Behemoth was originally scheduled to be released at Meta’s AI Developer Conference in April, but was pushed back to June. Now, the company is again delaying the model’s launch — at least until the fall, and possibly beyond. Meta declined to comment on the reports.
In April, Meta announced that it was already testing the Llama 4 Behemoth and called the new model "one of the smartest LLMs in the world and our most powerful to date, which will become a teacher for future models."
Instead, early last month, the company released two other models, the Llama 4 Scout and Llama 4 Maverick, which Meta says perform comparably to Google's Gemma 3 and OpenAI's GPT-4o, respectively.
Meta's struggles to release a new breakthrough model are in line with broader trends in the AI industry. Talk of OpenAI's next flagship model, GPT-5, has been going on since last year, but the ChatGPT maker has repeatedly delayed its release, shipping smaller models like o3 and o4-mini instead.
Sam Altman, CEO of OpenAI, said in early April that there were many reasons for postponing the release of GPT-5, but the main one was that the company would be able to "make GPT-5 much better than expected."
However, in November 2024, it was reported that both OpenAI and other leading AI companies were struggling to create more powerful LLMs. At the time, it was noted that GPT-5, known internally as Orion, was not living up to expectations and, like Meta's Behemoth, was not a significant improvement over the current generation.
The main challenge is finding enough new data to train these larger models. The second, for some companies, is money: developing and training larger models requires significantly more resources.