Thanks to its cloud subsidiary AWS, Amazon is doubling down on generative AI

ANALYSIS – AWS is investing on all fronts to expand its infrastructure around the world and develop its ecosystem, explains its vice-president, Swami Sivasubramanian.

AWS is on track to cross $100 billion in revenue this year. If it were independent, Amazon's cloud subsidiary would rank among the 100 largest companies in the world. But it has even bigger ambitions. For several months, the company has been multiplying investment announcements worth tens of billions of dollars to advance its artificial intelligence position around the world.

On the one hand, AWS is expanding its infrastructure of data centers and servers capable of handling the complex calculations required to develop generative AI, which is extremely demanding in computing power. United States, India, Saudi Arabia, the Emirates, Mexico, Spain ($15 billion over ten years), Singapore ($8 billion), Germany (€7.8 billion)… In total, the company plans to invest at least $150 billion over the next fifteen years, because businesses all over the world need storage capacity and computing power more than ever to exploit the full potential of these new technologies.


"We are currently seeing incredible activity in the AI space, especially among startups in Europe, with Hugging Face, for example, which has taken on the mission of democratizing open-source AI models, or Mistral AI, which also builds amazing foundation models," explains Swami Sivasubramanian, vice-president of AI and data at AWS, during his visit to Paris. "Generative AI fascinates the public, but this technology goes far beyond a simple chatbot like ChatGPT. It has the potential to significantly disrupt all industries," he adds.

Investing more in specialized AI start-ups is also an integral part of AWS's strategy. In addition to the $4 billion it has injected into Anthropic, a competitor of OpenAI, Amazon took part in the €220 million fundraising round of the young French start-up H, which also develops foundation models. The American giant has also just announced an additional $230 million, in the form of credits to use its infrastructure, for young companies using generative AI to solve complex challenges across different sectors.

A platform of models

If AWS is working hard, it is because the return on investment is already there. In the first quarter, its operating profit was $9.4 billion, accounting for more than 60% of the entire Amazon group's. Generative AI is already a growth driver worth "several billion dollars," Adam Selipsky, the CEO of AWS replaced last May by Matt Garman (until then in charge of sales and marketing), said in the first quarter. This is driven by accelerating spending on its enterprise offering by companies, several of which are moving from the experimentation phase into production.

For companies building models, AWS provides "state-of-the-art" infrastructure, with the best Nvidia components, its in-house AI chips (Trainium and Inferentia) and its SageMaker software to build, train and deploy these models. For companies wishing to develop AI applications on already-trained models, the giant offers Bedrock, its platform of models and tools. "We were the first to say that no single model will rule the world, that different models would be useful for different use cases. Today you can see that almost all cloud providers are imitating our strategy," emphasizes Swami Sivasubramanian.
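To make this more concrete, here is a minimal sketch, under stated assumptions, of what calling an already-trained model through Bedrock can look like with the AWS SDK for Python (boto3) and its Converse API; the region, model identifier and prompt below are illustrative placeholders, not details from the article, and model availability depends on the account.

```python
# Minimal sketch: querying a foundation model hosted on Amazon Bedrock
# via the AWS SDK for Python (boto3) and the Converse API.
import boto3

# Region and model ID are placeholder assumptions.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model identifier
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize our Q1 sales notes in three bullet points."}],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The generated reply sits inside the returned message structure.
print(response["output"]["message"]["content"][0]["text"])
```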

Generative AI fascinates the public, but this technology goes well beyond a simple conversational robot like ChatGPT. It has the potential to significantly disrupt all industries

Swami Sivasubramanian, vice president of AI and data at AWS

To develop its own generative AI assistant for businesses, called Amazon Q, the group itself used several models to handle different use cases (writing or debugging computer code, helping analyze business data…). "There may be some consolidation among the very large general-purpose models, but domain-specific models will explode in a big way," he adds.

Less vocal about its strategic generative AI announcements than other major American technology players, Amazon may have been perceived for a time as lagging behind. "The big difference is that we are not focused on a consumer chatbot, but on how to make every company an AI company and increase the productivity of every employee within those companies. So we tend to show that we deliver results first, and we let customers speak for us rather than the other way around," he emphasizes.

Data centers

However, Amazon is not forgetting the general public when it comes to generative AI. The company is testing a virtual assistant called Rufus that provides shopping advice to American customers browsing its e-commerce site. In his April letter to shareholders, CEO Andy Jassy promised an "even smarter" Alexa thanks to AI. A new version could notably allow it to hold conversations, provide more relevant responses, or even perform tasks requiring the use of several applications. "We are constantly innovating. And you'll see more and more innovation powered by generative AI across a wide variety of Amazon's businesses, like advertising. Nearly every aspect of Amazon is being reinvented with generative AI," confirms Swami Sivasubramanian.


The anticipated widespread adoption of these technologies inevitably raises the question of the rise in energy consumption that will accompany it. "We are investing heavily in making the model training process extremely energy efficient. And, with Bedrock, we offer our customers the opportunity to choose the most efficient model instead of the largest one. That means at runtime you save not only money but also energy," explains the executive.

In March, Amazon bought a data center in Pennsylvania, in the United States, for $650 million, powered directly by an adjacent power plant. Data centers are at the heart of the growth in electricity demand around the world. According to a study published by Goldman Sachs Research last May, data center energy demand will increase by 160% worldwide by 2030, mainly because of AI. According to calculations by the International Energy Agency, a single query made on ChatGPT requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search. "For each layer of the technology stack, we need to find the right energy-optimization and sustainability techniques, so that we can tackle this problem effectively. And that's exactly what we're doing," assures Swami Sivasubramanian.
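To put the IEA figures in perspective, here is a short back-of-the-envelope sketch: the roughly ten-to-one ratio follows directly from the numbers cited above, while the daily query volume used for the second calculation is a purely illustrative assumption, not a figure from the article.

```python
# Back-of-the-envelope comparison of the IEA estimates cited above.
CHATGPT_WH_PER_QUERY = 2.9  # watt-hours per ChatGPT query (IEA estimate)
SEARCH_WH_PER_QUERY = 0.3   # watt-hours per Google search (IEA estimate)

ratio = CHATGPT_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"A ChatGPT query uses roughly {ratio:.0f}x the electricity of a search")  # ~10x

queries_per_day = 100_000_000  # purely illustrative assumption
extra_mwh_per_day = queries_per_day * (CHATGPT_WH_PER_QUERY - SEARCH_WH_PER_QUERY) / 1_000_000
print(f"At {queries_per_day:,} queries/day, the extra demand is about {extra_mwh_per_day:,.0f} MWh per day")
```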
