
Defeating Llama 2, Closing in on GPT-4! 22-Person Mistral Reaches a $2 Billion Valuation in Just Six Months

The open-source marvel unfolds once again: Mistral AI has released the first open-source Mixture of Experts (MoE) large model.
Just a few days ago, a magnet link instantly sent shockwaves through the AI community.
An 87 GB torrent, an 8x7B MoE architecture – it looks like a mini, open-source GPT-4!
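The "8x7B" name refers to a Mixture-of-Experts design: eight expert feed-forward networks, with a learned gate routing each token to only its top-2 experts, so only a fraction of the parameters are active per token. The toy sketch below illustrates that routing idea with random placeholder weights; it is an illustration of top-2 MoE gating in general, not Mistral's actual implementation, and all shapes and names are assumptions.

```python
import numpy as np

# Toy top-2 Mixture-of-Experts routing, in the style reported for
# Mixtral 8x7B: 8 expert networks, each token sent to the 2 experts
# with the highest gate scores. Weights are random placeholders.
rng = np.random.default_rng(0)

d_model, n_experts, top_k = 16, 8, 2
gate_w = rng.standard_normal((d_model, n_experts))                 # router weights
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x):
    """Route one token vector x through its top-2 experts."""
    logits = x @ gate_w                                            # (n_experts,)
    top = np.argsort(logits)[-top_k:]                              # top-2 expert indices
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()      # softmax over the 2
    # Weighted sum of only the selected experts' outputs:
    # the other 6 experts do no work for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_layer(token)
print(out.shape)  # (16,)
```

Because only 2 of the 8 experts run per token, the compute per token is closer to a dense ~13B model even though the total parameter count is much larger, which is the appeal of the architecture.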
No press conference, no promotional videos – just a magnet link that kept developers up at night.
Since creating its official account, this French AI startup has shared only three pieces of content.
In June, Mistral AI went live. A 7-slide PowerPoint presentation secured the largest seed funding in European history.
In September, Mistral 7B was released, touted as the strongest open-source model with 7 billion parameters.
In December, an open-source version of Mistral 8x7B, similar to the GPT-4 architecture, was released. A few days later, the Financial Times revealed Mistral AI's latest funding round of $415 million, valuing the company at a staggering $2 billion, an eightfold increase.
Now, with just over 20 employees, the company has set the record for the fastest growth in the history of open-source companies.