Anthropic unveils a standard for LLM access to contextual data


Anthropic announces an open source protocol for connecting an LLM application, in a standardized way, to “contextual” data sources: files, API responses, etc.

Anthropic is launching an open source standard called MCP (for Model Context Protocol) that allows LLMs to access contextual data. An SDK lets developers create MCP servers that act as connectors to various data sources: local files, databases, log files, development environments, API responses.
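
For illustration, here is a minimal sketch of what such a connector might look like with the Python SDK's FastMCP helper; the server name, resource URI and exposed log file are hypothetical examples, not code from the announcement.

```python
# Hypothetical MCP server exposing a local log file as a resource,
# plus a small tool. Assumes the Python SDK's FastMCP decorator API.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("local-logs")  # server name shown to the connecting client


@mcp.resource("file://app.log")
def app_log() -> str:
    """Return the contents of a local log file as a readable resource."""
    return Path("app.log").read_text()


@mcp.tool()
def count_errors() -> int:
    """Count the lines containing 'ERROR' in the log file."""
    return sum("ERROR" in line for line in Path("app.log").read_text().splitlines())


if __name__ == "__main__":
    # Serve over stdio so a host application (e.g. Claude Desktop) can launch it.
    mcp.run()
```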

Via these MCP servers, the LLM application (Claude or another) can connect to these resources to read information and subscribe to updates. Anthropic naturally recommends that appropriate access controls to the resources be implemented and audited.
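
On the client side, an application could discover and read a connector's resources roughly as sketched below, assuming the Python SDK's stdio client and the hypothetical server above (the command and URI are illustrative).

```python
# Hypothetical MCP client: launches the server above over stdio,
# lists its resources and reads one so it can be injected as context.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# How to start the connector process (assumed path to the server script).
server_params = StdioServerParameters(command="python", args=["server.py"])


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # protocol handshake

            # Discover what the connector exposes.
            resources = await session.list_resources()
            for resource in resources.resources:
                print(resource.uri, "-", resource.name)

            # Read a specific resource to use as context for the model.
            content = await session.read_resource("file://app.log")
            print(content)


if __name__ == "__main__":
    asyncio.run(main())
```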

For example, the solution could make it possible to query a PDF stored on the computer from the chat, or to feed GitHub code to the model without copying and pasting. As such, it is in the same vein as the computer-use capability Anthropic unveiled in October.

“Whether you’re building an AI-powered IDE, improving a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need,” explains Anthropic in its documentation.

To make it easier for developers, the company provides pre-built MCP servers for various services – Google Drive, Slack, GitHub, Git, Postgres and Puppeteer – as well as a repository. It also announces MCP server support in the Claude desktop applications. It remains to be seen whether OpenAI, Google and others will adopt the standard…
