Source: www.mediapost.com, July 2024


After months of build-up, Meta’s latest version of its Llama artificial intelligence model — Llama 3.1 — has landed.

The model was created in partnership with chipmaker Nvidia. It is available in three versions, the largest of which is Meta’s biggest open-source AI model yet, keeping the company in direct competition with tech giants and startups including Google, Amazon, OpenAI and Anthropic.

“Meta AI is on track to reach our goal of becoming the most used AI assistant in the world by the end of the year,” Meta CEO Mark Zuckerberg predicts in a recent Instagram post, adding that Llama 3.1 is “smarter, supports more languages,” has “better reasoning” and will now become available in more countries.

Compared with the Llama 3 models, Llama 3.1 is much larger: its biggest version has 405 billion parameters and was trained with more than 16,000 of Nvidia’s H100 GPUs, each of which costs between $30,000 and $40,000. The model can carry out a variety of tasks, including coding, answering basic math questions and summarizing documents in eight languages.
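For a sense of what using the model looks like in practice, here is a minimal sketch of prompting a smaller Llama 3.1 checkpoint to summarize a document with the open-source Hugging Face transformers library. The checkpoint name, prompt wording and settings are illustrative assumptions rather than Meta's own tooling, and the gated weights require accepting Meta's license first.

```python
# Rough sketch: summarize a document with a Llama 3.1 instruct checkpoint.
# The model ID and prompt below are assumptions for illustration only.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # assumed checkpoint ID
    device_map="auto",
)

document = "(long report text goes here)"
messages = [
    {"role": "system", "content": "You summarize documents concisely."},
    {"role": "user", "content": f"Summarize this in three sentences:\n\n{document}"},
]

output = generator(messages, max_new_tokens=200)
# The pipeline returns the full chat; the last message is the model's summary.
print(output[0]["generated_text"][-1]["content"])
```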

Meta trained Llama 3.1 405B with a dataset of 750 billion words and says that in doing so it introduced “more rigorous” quality-assurance and data-filtering approaches during development, while also using data generated by other AI models, or “synthetic data,” for fine-tuning.
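To make the “synthetic data” idea concrete, the sketch below shows one common pattern: an existing instruction-tuned model drafts question-and-answer pairs, which are then filtered and saved for later fine-tuning. The teacher checkpoint, prompts and file layout are assumptions for illustration, not Meta’s actual pipeline.

```python
# Illustrative only: generate candidate Q&A pairs with a "teacher" model and
# keep the ones that pass a simple filter. Names and prompts are assumptions.
import json
from transformers import pipeline

teacher = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # assumed teacher checkpoint
    device_map="auto",
)

topics = ["basic algebra", "Python string handling", "summarizing an email"]
with open("synthetic_pairs.jsonl", "w", encoding="utf-8") as out:
    for topic in topics:
        prompt = [{
            "role": "user",
            "content": f"Write one question about {topic}, then answer it. "
                       "Use the format 'Q: ...' followed by 'A: ...'.",
        }]
        draft = teacher(prompt, max_new_tokens=200)[0]["generated_text"][-1]["content"]
        # Stand-in for the "more rigorous" filtering the article mentions:
        # only keep drafts that actually follow the requested format.
        if "Q:" in draft and "A:" in draft:
            out.write(json.dumps({"topic": topic, "pair": draft}) + "\n")
```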

According to Meta researchers, the training mix added more recent web data and more non-English data to improve the model’s performance for non-English speakers and keep it better informed about current events.

To increase use of its free model, Meta has partnered with two dozen companies, including Microsoft and its Azure cloud, Amazon, Google, Nvidia, Dell and Databricks, to help developers make their own versions and train Llama 3.1 on custom data.
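As a rough illustration of what “training Llama 3.1 on custom data” can look like, the sketch below attaches LoRA adapters to a smaller checkpoint using the open-source peft and transformers libraries. The checkpoint name, the custom_data.jsonl file and all hyperparameters are assumptions; partners’ hosted offerings wrap this kind of workflow in their own tooling.

```python
# Hedged sketch: adapt a Llama 3.1 checkpoint to custom data with LoRA.
# File names, checkpoint ID and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base = "meta-llama/Llama-3.1-8B-Instruct"   # assumed checkpoint ID
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# Wrap the frozen base model with small trainable LoRA matrices.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# "custom_data.jsonl" is a hypothetical file with one "text" field per record.
data = load_dataset("json", data_files="custom_data.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama31-custom",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("llama31-custom-adapter")  # saves only the adapter weights
```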

This approach diverges from that of other companies, including OpenAI, which sells access to its large language models and related services.

“We’re actively building partnerships so that more companies in the ecosystem can offer unique functionality to their customers as well,” Zuckerberg wrote in a recent blog post.

The ongoing partnership between Meta and Nvidia reflects a mutual reliance: Meta needs a steady supply of cutting-edge GPUs to power and train its future Llama models, while Nvidia benefits as outside tech companies, which avoid model license fees by adopting Meta’s open-source models, still run those models on Nvidia’s chips.