“At Eiffage, we have developed our own private generative AI with Gemini”

The Eiffage group announced on Thursday, June 27, an extensive collaboration with Google Cloud to accelerate its adoption of artificial intelligence. Jean-Philippe Faure, chief information officer, sets out the construction group’s AI strategy.

JDN. What new use cases in generative AI have you been able to develop using Google models?

Jean-Philippe Faure is the CIO of Eiffage. © Cyrille Dupont / The Pulses

Jean-Philippe Faure. We have developed our own private generative AI with Gemini. We feed it all of our technical reports, classified by type of project. As a result, wherever they are located in France, our employees can draw on their colleagues’ experience when responding to calls for tenders. For example, if someone asks for the air-conditioning characteristics of a 10,000 square meter building under renovation, they can retrieve all the relevant information from similar projects. Of course, the result is only a working document. Each employee then cross-references the information provided by the AI with their own professional experience, adapting the proposed options to their specific situation.
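The interview does not give implementation details, but a private document assistant of this kind is typically built as a retrieve-then-generate pipeline. The sketch below is only an illustration of that pattern with the Gemini API: the model names, the naive dot-product retrieval and the sample report excerpts are assumptions, not Eiffage’s actual architecture.

```python
# Minimal sketch of an "ask the project archive" assistant: embed report
# chunks, retrieve the closest ones to a question, and let Gemini answer
# from that context only. Illustrative assumptions throughout.
import numpy as np
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential

def embed(text: str) -> np.ndarray:
    # Embed a chunk of a technical report for similarity search.
    result = genai.embed_content(model="models/text-embedding-004", content=text)
    return np.array(result["embedding"])

# In practice the chunks would come from thousands of classified reports.
report_chunks = [
    "HVAC sizing for a 10,000 m2 office renovation: ...",
    "Tender response, school refurbishment, Lyon: ...",
]
index = [(chunk, embed(chunk)) for chunk in report_chunks]

def answer(question: str, top_k: int = 3) -> str:
    q = embed(question)
    ranked = sorted(index, key=lambda item: -float(np.dot(q, item[1])))
    context = "\n---\n".join(chunk for chunk, _ in ranked[:top_k])
    model = genai.GenerativeModel("gemini-1.5-pro")
    prompt = (
        "Using only the excerpts below from past project reports, answer the "
        "question. Flag anything that must be checked by an engineer.\n"
        f"Excerpts:\n{context}\n\nQuestion: {question}"
    )
    return model.generate_content(prompt).text

print(answer("What air-conditioning characteristics were used for a 10,000 m2 renovation?"))
```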

“We are currently working on the integration of multimodal data, such as construction site photos and videos.”

Another use case concerns site reports. Every week, our teams meet with all the site stakeholders to take stock of progress. We have developed a Gemini-based solution that automatically generates a report from an audio recording of the meeting. Although the generated report always has to be reread and adjusted, it saves our employees valuable time. Our goal is not to replace humans, but to enable them to complete certain time-consuming tasks more quickly.
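As a rough illustration of how an audio recording can be turned into a draft report with Gemini (the file name, prompt and model below are assumptions, not Eiffage’s production pipeline):

```python
# Sketch of turning a recorded site meeting into a draft progress report,
# assuming the google-generativeai SDK and its File API.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential

audio = genai.upload_file("weekly_site_meeting.mp3")  # hypothetical recording
model = genai.GenerativeModel("gemini-1.5-pro")

prompt = (
    "You are drafting the weekly progress report for a construction site. "
    "From this meeting recording, list: attendees, decisions taken, progress "
    "per work package, blockers, and actions with owners and dates. "
    "Mark anything uncertain for human review."
)
draft = model.generate_content([prompt, audio])
print(draft.text)  # always reread and adjusted by the works manager
```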

We are currently working on integrating multimodal data, such as construction site photos and videos, to further enrich these automated reports. We are running a pilot with our regional Route Île-de-France department, because the time this solution saves is of real interest to our works managers. Ultimately, we also want to use this technology to generate site feedback reports. By integrating time-stamped emails, photos and videos from a three-year project, we will be able to generate a first draft of the lessons-learned report (the “REX”), 80 pages for example, offering real added value to our teams.
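A multimodal version of the same idea might simply interleave dated evidence with the drafting instruction, as in this sketch (file names, dates and the chapter prompt are invented placeholders):

```python
# Sketch of enriching an automated report with multimodal site evidence,
# assuming the google-generativeai SDK; not Eiffage's actual pipeline.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-pro")

# Time-stamped photos from the project archive (videos work the same way,
# but must finish server-side processing before they can be referenced).
evidence = [
    ("2023-04-12", genai.upload_file("slab_pour_north_wing.jpg")),
    ("2024-01-30", genai.upload_file("facade_inspection.jpg")),
]

parts = ["Draft the 'structural works' chapter of a lessons-learned (REX) "
         "report for this project, citing the dated evidence below."]
for date, photo in evidence:
    parts += [f"Evidence dated {date}:", photo]

print(model.generate_content(parts).text)
```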

How many employees have access to Gemini?

We have deployed Gemini to about a hundred people and are continuing to expand access. We gather feedback at each stage to assess the results. We collaborate with different departments, including human resources, operations and IT. We also involve close collaborators who can discern whether the AI is “hallucinating” or not.

Do you also use machine learning systems for more complex use cases?

Yes, I can give you two examples. The first is called Metronia, a solution deployed on the Toulouse metro site to manage the vibrations generated by the tunnel boring machine. It’s a huge machine that digs the metro tunnel, causing significant vibrations. Our objective is to monitor these vibrations using sensors installed on the site, up to 1,000 of them, which we move as the work progresses. We collect around 500,000 messages per day and send them to the cloud, which represents around 10,000 daily calculations. This allows us to monitor what is happening around the site during digging and to check that everything is going correctly. The incident on the A13, where an accidental digger strike caused cracks in the road, clearly illustrates the importance of this monitoring. Sensors collect data locally, while storage, transformation and calculations are performed in the cloud. We are in the process of migrating to GCP after testing different solutions.
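To give a sense of what the cloud-side calculations in such a pipeline can look like, here is a minimal sketch of batch vibration checking. The message format, threshold value and alerting logic are illustrative assumptions, not details of Metronia itself.

```python
# Aggregate a batch of vibration readings per sensor and flag exceedances.
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class VibrationReading:
    sensor_id: str
    timestamp: float           # Unix seconds
    peak_velocity_mm_s: float  # peak particle velocity

ALERT_THRESHOLD_MM_S = 5.0  # assumed limit; real limits depend on nearby structures

def evaluate(readings: list[VibrationReading]) -> dict[str, str]:
    """Return a status per sensor for one batch of messages."""
    per_sensor = defaultdict(list)
    for r in readings:
        per_sensor[r.sensor_id].append(r.peak_velocity_mm_s)
    status = {}
    for sensor_id, values in per_sensor.items():
        if max(values) >= ALERT_THRESHOLD_MM_S:
            status[sensor_id] = "ALERT"
        else:
            status[sensor_id] = f"OK (avg {mean(values):.2f} mm/s)"
    return status

batch = [
    VibrationReading("S-017", 1_719_480_000.0, 1.2),
    VibrationReading("S-017", 1_719_480_010.0, 6.3),
    VibrationReading("S-342", 1_719_480_000.0, 0.8),
]
print(evaluate(batch))  # {'S-017': 'ALERT', 'S-342': 'OK (avg 0.80 mm/s)'}
```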

The second example concerns the prevention of construction site accidents. With Dataiku, we have developed a solution for assessing risks on construction sites. We analyze the history of site visits and the resulting observations, the time records of internal and external staff, their seniority, the type of employment, the site’s supervision rate, as well as open weather data. From this information, our proprietary algorithm assigns a risk level to each site, taking into account the seniority of employees, the results of the most recent visits, the supervision rate and weather forecasts.
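The scoring algorithm itself is proprietary, so the sketch below only illustrates the general shape of a risk score built from the kinds of features mentioned: the weights, scale and sample values are invented for illustration.

```python
# Illustrative site-risk score combining visit findings, staff seniority,
# supervision rate and weather exposure (assumed weights).
from dataclasses import dataclass

@dataclass
class SiteFeatures:
    open_findings_last_visits: int   # unresolved observations from recent visits
    avg_seniority_years: float       # average seniority of staff on site
    supervision_rate: float          # supervisors per worker, 0..1
    bad_weather_probability: float   # forecast probability of adverse weather, 0..1

def risk_score(f: SiteFeatures) -> float:
    """Return a 0-100 risk score; higher means riskier."""
    score = 0.0
    score += min(f.open_findings_last_visits, 10) * 4.0    # past findings
    score += max(0.0, 5.0 - f.avg_seniority_years) * 5.0   # inexperienced teams
    score += (1.0 - f.supervision_rate) * 20.0              # thin supervision
    score += f.bad_weather_probability * 15.0               # weather exposure
    return min(score, 100.0)

site = SiteFeatures(open_findings_last_visits=3, avg_seniority_years=2.5,
                    supervision_rate=0.6, bad_weather_probability=0.7)
print(f"Risk score: {risk_score(site):.1f}/100")  # Risk score: 43.0/100
```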

How do you define AI use cases with the best ROI?

We consider an AI use case to have the best ROI potential when it meets a need common to all our branches of activity. Even though our businesses are all related to construction, they remain different. So when the 15 branches agree on priorities, it really makes sense. Our goal is to offer a service to as many people as possible, while avoiding favoritism toward any one branch. We work as a tight-knit team of around forty people in collaboration with the branches, and everyone is a stakeholder in the project.

“Google Cloud’s philosophy of openness interests us much more than a closed ecosystem”

Of course, there will probably be some “versioning” depending on the branch, because some will be more sensitive to one criterion or another. But the basic model will remain the same; only its sensitivity may vary. The more standard models we can generate, the more value we will create. Our goal is to standardize our needs as much as possible and avoid developing overly specific solutions that do not benefit the entire company.

Why did you choose Google Cloud for your AI strategy rather than another provider like Azure OpenAI for example?

Our choice of Google Cloud for our AI strategy comes down to our desire for openness. Historically, we have been closely tied to Microsoft, with a strong dependence on their solutions, even though Azure is an excellent platform and Microsoft a remarkable company. However, we needed a more open approach. Working with Google, we appreciated their transparency: they showed us how to replace each component of their solution with alternatives if necessary. This philosophy of openness interests us far more than a closed ecosystem like Microsoft’s. Despite Microsoft’s undeniable qualities, it is extremely difficult to extract yourself from their environment once you are embedded in it. We did not want to add another dependency to the ones we already have.
