Elon Musk releases 318 GB torrent and makes Grok open source

At war with OpenAI and its business model, Elon Musk promised in early March that he would open source the language model behind Grok, his chatbot. Promise kept.

GPT-4, Gemini Pro, Llama 2 and Mistral Large now know the name of their rival: Grok-1.

Since March 18, Elon Musk’s teams have been offering a giant 318.24 GB file for download, distributed as a torrent (the magnet link is in the bio of his X account). It contains a language model with 314 billion parameters, developed by his company xAI. This model, Grok-1, powers the Grok chatbot available on X since late 2023. Mistral used the same torrent-distribution technique in December 2023.

By clicking on the magnet link, you can download a 318.24 GB language model. // Source: Numerama

Can Grok become a serious alternative to other language models?

Touted by many as a satirical version of ChatGPT (it can insult its users and make vulgar jokes), the Grok chatbot has never been considered a serious contender in the generative AI race. Now that it is an open source model, usable by anyone who wants it, that may change.

Grok-1 is no longer a language model exclusive to Elon Musk’s service: it can be run locally by anyone with a powerful enough machine. The developers have also published it on GitHub, encouraging the AI community to experiment with it. Elon Musk’s LLM thus joins the reference models in open source AI, alongside the French company Mistral and the giant Meta (Google and OpenAI take a more closed approach).

Elon Musk is sometimes opaque, but he believes in open source.

In a blog post, xAI explains that the model published via torrent dates from October 2023 and is based on a Mixture-of-Experts (MoE) architecture. It is a raw base model: it has not been fine-tuned for any specific task, so its out-of-the-box capabilities are limited, but developers can adapt it to their own needs. For each token, only about 25% of Grok-1’s weights are active, which makes inference more efficient.
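The idea behind Mixture-of-Experts can be sketched in a few lines: a learned router sends each token to a small subset of expert networks, so only a fraction of the model’s weights do any work per token. The toy NumPy sketch below illustrates the principle; the 8-experts / top-2 figures match Grok-1’s reported configuration (2 of 8 experts active per token, i.e. ~25%), but everything else here is an illustrative assumption, not xAI’s implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Grok-1 reportedly uses 8 experts
TOP_K = 2         # with 2 active per token -> 2/8 = 25% of expert weights
DIM = 16          # toy hidden size, purely for illustration

# Each "expert" is a small feed-forward weight matrix (toy stand-in).
experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS))  # gating weights

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector x to its top-k experts and combine
    their outputs, weighted by a softmax over the router scores."""
    logits = x @ router
    top = np.argsort(logits)[-TOP_K:]        # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the selected experts
    # Only TOP_K of the NUM_EXPERTS matrices are ever multiplied here.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(DIM)
out = moe_layer(token)
print(out.shape)  # a normal hidden vector, computed with 2 of 8 experts
```

The efficiency gain is exactly this sparsity: the model keeps the capacity of all 314 billion parameters, but each token only pays the compute cost of the experts it is routed to.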

The official ChatGPT account responded ironically to the Grok-1 announcement, prompting Elon Musk to criticize OpenAI again.

In early March, Elon Musk announced that he was suing OpenAI, which he accuses of lying about its intentions (he helped found the company before walking away from the project). OpenAI counters that Elon Musk wanted to merge the company with Tesla, which caused the falling-out. With xAI, Elon Musk hopes to build a rival to OpenAI’s model, in the hope of damaging the company.


