Meta to launch Llama 3 LLM

Shruti Govil — April 10, 2024
Updated 2024/04/10 at 3:46 PM

Meta says it will release Llama 3, the large language model that powers its AI assistants, within the next month.

Speaking at an event in London, the company's Chief Product Officer, Chris Cox, discussed what appear to be several versions of the product. According to a TechCrunch report, the company plans to unveil Llama 3 "within the next month, actually less, hopefully in a very short period of time."

He went on to say that Llama 3 will power a number of products across Meta.

Llama 3 is expected to address the shortcomings of its predecessors while adding new capabilities.

The public hasn't been pleased with Meta's cautious approach to AI. The original version of Llama was never officially released, though it surfaced online through leaks. The company eventually made Llama 2 publicly available in July 2023.

The Llama family of models is open source, an approach that sets Meta apart from other developers. Rather than relying on more proprietary models, Meta hopes that making its large language models freely available will win over developers. Even so, only a limited number of people can currently use Emu, Meta's text-to-image and video generation tool.

A limited version may arrive first.

Meta launched Llama 2 in July of last year, and the timing of Llama 3 likely reflects a desire to maintain a regular release schedule.

Releasing a limited version of the next model early can build excitement about its potential. Anthropic's smaller model, Claude 3 Haiku, offers some of the same capabilities as OpenAI's larger GPT-4.

The field of AI models is expanding quickly and growing more competitive, especially with new entrants such as Databricks, Mistral, and Stability AI joining the open-source arena.

Smaller models are also becoming increasingly valuable to enterprises thanks to their lower operating costs, ease of fine-tuning, and, in some cases, ability to run on local hardware.
