The race for AI is on, even at the cost of exhausting researchers. Between ChatGPT and Gemini comes "Lucie", an AI developed by Linagora in collaboration with the CNRS and backed by the French government. Except that nothing has gone as planned.
"Lucie" draws ridicule from Internet users
Yet the promise was appealing: "Lucie" is billed as the first open-source French generative AI model aligned with European values, planned for educational use in 2025 as France seeks sovereign alternatives.
Internet users quickly noticed the AI's limits. For the simple mathematical operation "5(3+2)", for example, "Lucie" answered 17, then 50 when asked to detail its reasoning. The model also suggested that eggs are "produced by cows".
Aberrant responses have multiplied in recent days, provoking mockery from Internet users, who have tested the model's limits with surprising results: "Lucie" calculated, for instance, that "the square root of a goat is 1". That is quite a gap from the stated ambitions, which describe an AI that is "particularly transparent and reliable" and intended for education, government and research.
Faced with these failures, Linagora announced the "temporary closure" of Lucie.chat. The company explains that the AI is still in an experimental phase, hence the errors, and describes the project as an academic research initiative meant to demonstrate the feasibility of developing open-source generative AI resources.
It should also be noted that no work has yet been carried out with the Ministry of National Education to adapt the model for educational use, at a time when experts are already concerned about the presence of this technology in schools.
Linagora admits "Lucie's" biases and errors
The project was developed with the OpenLLM-France community as part of a France 2030 call for projects. "Lucie" is in its early stages, and Linagora explains that the AI operates with minimal tuning and has no safeguards against inappropriate use. The company admits that its answers contain biases and errors at this stage of development.
According to "Lucie's" creators, its rapid public release was an error of judgment, motivated by the desire to follow the open-source logic of openness and co-construction. The company acknowledges that the AI has reasoning limitations in simple mathematics and code generation, and regrets not having communicated these limitations better to avoid unrealistic expectations.
By closing "Lucie" temporarily, the teams want to take the time to explain their approach. Linagora calls for understanding and respect for the work of researchers seeking to build ethical, transparent and trustworthy AI.