This new lab called Kyutai will be a privately-funded nonprofit working on artificial general intelligence.
The article is about Kyutai, a French AI lab that aims to compete with ChatGPT and the rest while being fully open source (research papers, models, and training data).
That's thousands of salaries for a year; not bad for an unknown company. More than enough to produce something that can attract further funding. Many startups have become successful with less.
$330M is not nothing. But with funding split between a telecom CEO and a shipping & logistics CEO, one has to wonder what sort of direction and tuning the team might be encouraged to explore. How will they stack up against existing, proven open source nonprofits with impressive releases, like EleutherAI?
These open source projects are neat in that they give the average person the opportunity to peek under the hood of an LLM they'd never be able to run on consumer-level hardware. There are some interesting things to find, especially in the dataset snapshots that Eleuther made available.
In general, it's kind of cool to see France on the cutting edge of these things. And I think it's worth saluting any project that moves to decentralize power away from states and megacorps, who seal wonderful, powerful things in black boxes.
France is indeed on the cutting edge of AI: FAIR (Facebook's AI lab) has a big office in Paris, and it's headed by Yann LeCun. So there are plenty of researchers being trained on the state of the art.
I'm sorry, are you crazy? Do you know any of the internet's history? It was American universities, the government, and defense contractors that created the internet.
Ideally, they'd just blow the entire $330M training an LLM, and release the weights. In reality, much of that money will probably go into paying salaries, various smaller research projects, etc.
The context is that LLMs need a big up front capital expenditure to get started, because of the processor time to train these giant neural networks. This is a huge barrier to the development of a fully open source LLM. Once such a foundation model is available, building on top of it is relatively cheaper; one can then envision an explosion of open source models targeting specific applications, which would be amazing.
So if the bulk of this €300M could go into training, it would go a long way to plugging the gap. But in reality, a lot of that sum is going to be dissipated into other expenses, so there's going to be a lot less than €300M for actual training.
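The argument above can be made concrete with some back-of-envelope arithmetic. A minimal sketch, where every number is an invented assumption for illustration (not a figure from the article): pretraining a foundation model takes orders of magnitude more GPU-hours than fine-tuning on top of an existing one, which is exactly why the up-front capital matters.

```python
# Back-of-envelope sketch of why LLM pretraining needs big up-front capital.
# All figures below are illustrative assumptions, not numbers from the article.

gpu_hour_cost = 2.0                 # assumed cloud price per GPU-hour, in USD
gpu_hours_foundation = 5_000_000    # assumed GPU-hours to pretrain a foundation model
gpu_hours_finetune = 10_000         # assumed GPU-hours to fine-tune on top of it

pretrain_cost = gpu_hour_cost * gpu_hours_foundation
finetune_cost = gpu_hour_cost * gpu_hours_finetune

print(f"pretraining:  ${pretrain_cost:,.0f}")   # the one-time capital expenditure
print(f"fine-tuning:  ${finetune_cost:,.0f}")   # cost of building on the open weights
print(f"ratio: {pretrain_cost / finetune_cost:.0f}x")
```

Under these assumed numbers, pretraining costs 500 times more than a downstream fine-tune, which is why a single open foundation model can unlock many cheap derivative projects.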
This morning at Scaleway’s ai-PULSE conference, French billionaire and Iliad CEO Xavier Niel gave some extra details about his plans for an AI research lab based in Paris.
Six men took the stage this morning to talk about their previous work and what they have in mind for the research lab: Patrick Perez, Édouard Grave, Hervé Jégou, Laurent Mazaré, Neil Zeghidour and Alexandre Défossez.
Kyutai has also put together a team of scientific advisors who are well-known AI researchers — Yejin Choi, Yann LeCun and Bernhard Schölkopf.
“When it comes to the timeline, I don’t think our aim is necessarily to go as fast as Mistral, because our ambition is to provide a scientific purpose, an understanding and a code base to explain the results,” Défossez said at the press conference.
Macron also used this opportunity to define and defend France’s position on Europe’s AI Act, saying that use cases should be regulated, not model makers.
“It’s not a question of defining good models, but we need to ensure that the services made available to our citizens are safe for them, for other economic players and for our democracy,” Macron said.
The original article contains 905 words, the summary contains 192 words. Saved 79%. I'm a bot and I'm open source!
Smart. Even Google knows they can't compete with open source models, since open source development of AI models iterates much faster, and a model constrained by corporate compliance can't catch up with it.
So an open source model is their best way to leapfrog these giants.
It seems like their goal is not to train new LLMs but to do actual scientific research. Large language models are such a tiny part of the whole machine learning and AI field that the amount of attention they get from the mass media is ridiculous. But people do like their stupid chatbots.
What use would an AI be if it were made by French developers? The source would likely be in French (i.e., variable, function, and object names, as well as comments). Yes, they are that in love with their own language. Check out their names for just about everything related to computers...
I've never seen French code in any of my jobs; it's all in English. Most frameworks are in English anyway, so why would they code in French?
Symfony (PHP) is from a French company, and it's in English, with the docs also available in English.
And yes, there might be French translations of English words; how is that crazy? That's the definition of a language. Otherwise we would all have the same words for everything, and therefore the same language.
As a software engineer from Germany, I can say you'd be surprised how much code exists in other languages. But I would expect companies on the cutting edge of technology, working closely with universities or with open source, to usually choose English.
I'll take this opportunity to highlight that scikit-learn (an open source ML library) is developed in large part at INRIA (based in Paris), and people have been relying on its code for preprocessing, baselines, and the rest for a long time. And all of the documentation is in English.
On the other side, MicMac, which is by far the best free photogrammetry package, is developed by France's IGN and is loaded with French comments, function and variable names, etc.
However, the English wiki has come a LONG way since I first had to figure it out, and while it's still much more a box of tools and parts than a single-click app, it has likely gone from "a set of blueprints and a sack of unsorted bolts" to "a kit car with a rolling chassis".
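To make concrete what French-language source looks like in practice, here is a minimal sketch; the identifiers are invented for illustration and are not taken from MicMac or any real project.

```python
# Hypothetical example of French-named code, as described in the comments above.
# "calcule_moyenne" = "compute mean", "valeurs" = "values". Invented names.

def calcule_moyenne(valeurs):
    """Return the mean ('moyenne') of a list of values ('valeurs')."""
    return sum(valeurs) / len(valeurs)

# The same function with English identifiers, as most internationally
# developed open source projects would write it:
def compute_mean(values):
    """Return the mean of a list of values."""
    return sum(values) / len(values)

print(calcule_moyenne([1, 2, 3]))  # 2.0
print(compute_mean([1, 2, 3]))     # 2.0
```

Both functions behave identically; the only difference is whether a non-French-speaking contributor can read the names, which is the crux of the thread's debate.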
Probably not; this is Quebec. I'm in a French lab and everything is written in English. You don't really have a choice, as you're collaborating internationally. Even if the lab is based and funded in France, not all of the people in it will be French.
They plan to have scientific advisors who are not French, according to the link.
Eh, get the AI to translate the code into your language of preference.
Personally, as an Italian, I think it would be good for Europeans to learn languages other than English... And the most widespread are French and Spanish.
I went through school with classes in Spanish for four years and Italian for two; it was obligatory to choose a third language. The second I was done with school, I had already forgotten everything I learned of those languages. If I'm not going to use a language daily, I will forget it. It was a monumental waste of the students' time. Now I have a spouse who speaks another language, one that was never an option to learn in school. We both speak English, though. There is little way to predict which languages will be useful for a given student. English as a second language is a good bet: every other European I've met, except Russians, has been able to communicate with me in English, and 99% will never benefit from a third language. They should have taught computer science or something instead. Before LLMs, I thought they should at least have focused on teaching languages that are hard to machine translate, like Russian, Japanese or Korean. That is my opinion.