Criticized for its glaring errors, the government-backed AI “Lucie” “temporarily shuts down”

The AI race is on, even at the cost of exhausting researchers. Slipping in between ChatGPT and Gemini is “Lucie”, an AI developed by Linagora in collaboration with the CNRS and backed by the government. Except that nothing is going as planned.


“Lucie” draws mockery from Internet users

Yet the promise was appealing: “Lucie” is billed as the first French open-source generative model aligned with European values, intended for educational use in 2025 as France seeks sovereign AI solutions.

Internet users quickly noted the limits of the AI. For example, for the simple mathematical operation “5(3+2)”, “Lucie” answers 17, then 50 when asked to detail its reasoning. The model also claims that eggs are “produced by cows”.
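For reference, the operation the model failed is elementary: the implicit multiplication means 5 × (3 + 2), with the parenthesized sum evaluated first. A minimal check in Python:

```python
# "5(3+2)" in standard notation means 5 * (3 + 2):
# evaluate the parentheses first, then multiply.
inner = 3 + 2        # parenthesized sum: 5
result = 5 * inner   # implicit multiplication: 25
print(result)        # 25, not 17 or 50
```

Neither of Lucie’s answers (17, then 50) matches any common order-of-operations mistake, which is part of why the response drew so much attention.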

The aberrant responses have multiplied in recent days, drawing mockery from Internet users, who tested the model’s limits with surprising results. For example, “Lucie” calculated that “the square root of a goat is 1”. Suffice it to say, a far cry from the stated ambitions of an AI that is “particularly transparent and reliable”. Yet this is a model designed for education, government, and research.

Faced with these failures, Linagora announced the “temporary closure” of Lucie.Chat. The company specifies that the AI is in an experimental phase, hence the errors, and that this is an academic research initiative meant to demonstrate the feasibility of developing open-source generative AI digital resources.

It should also be noted that no work has yet been done with the national education system to adapt the model for educational use, while experts remain concerned about the presence of the technology in schools.


Linagora admits the biases and errors of “Lucie”

The project was developed with the OpenLLM-France community as part of a France 2030 call for projects. “Lucie” is only in its early stages, and Linagora explains that the AI runs with minimal settings and has no safeguards to prevent inappropriate use. The company admits that its answers contain biases and errors at this stage of development.

According to the creators of “Lucie”, its hasty launch was an error of judgment, motivated by the desire to follow the open-source logic of openness and co-construction. The company acknowledges that the AI has reasoning limits in simple mathematics and code generation, and Linagora regrets not having communicated better about these limitations to avoid unrealistic expectations.

By closing “Lucie” temporarily, the teams wish to take the time to explain their approach. Linagora calls for understanding and respect for the work of researchers seeking to create an ethical, transparent, and trustworthy AI.
