Announcements

Introducing the Model Context Protocol

An abstract illustration of critical context connecting to a central hub

Today, we're open-sourcing the Model Context Protocol (MCP), a new standard for connecting AI assistants to the systems where data lives, including content repositories, business tools, and development environments. Its aim is to help frontier models produce better, more relevant responses.

As AI assistants gain mainstream adoption, the industry has invested heavily in model capabilities, achieving rapid advances in reasoning and quality. Yet even the most sophisticated models are constrained by their isolation from data—trapped behind information silos and legacy systems. Every new data source requires its own custom implementation, making truly connected systems difficult to scale.

MCP addresses this challenge. It provides a universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol. The result is a simpler, more reliable way to give AI systems access to the data they need.

Model Context Protocol

The Model Context Protocol is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers.
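
As a rough sketch of the server side, the snippet below uses the open-source MCP Python SDK to expose a small in-memory note store as readable resources over stdio. The server name, the note:// URIs, and the data itself are hypothetical, and the exact SDK surface may differ from this outline.

```python
import asyncio

import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server

# Hypothetical in-memory "data source" standing in for a real system
# (a wiki, a database, a content repository, ...).
NOTES = {"welcome": "Hello from an MCP server!"}

app = Server("example-notes-server")


@app.list_resources()
async def list_resources() -> list[types.Resource]:
    # Advertise each note as a resource that a connected client can read.
    return [
        types.Resource(uri=f"note://{name}", name=name, mimeType="text/plain")
        for name in NOTES
    ]


@app.read_resource()
async def read_resource(uri) -> str:
    # Resolve a requested note:// URI back to its contents.
    name = str(uri).replace("note://", "").rstrip("/")
    return NOTES[name]


async def main() -> None:
    # Serve over stdio so a host application (the MCP client) can launch
    # this process and exchange protocol messages with it.
    async with stdio_server() as (read_stream, write_stream):
        await app.run(read_stream, write_stream, app.create_initialization_options())


if __name__ == "__main__":
    asyncio.run(main())
```

The host application plays the other role: it launches the server, negotiates capabilities, and then lists and reads resources (or calls tools) on behalf of the model.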

Today, we're introducing three major components of the Model Context Protocol for developers:
- The Model Context Protocol specification and SDKs
- Local MCP server support in the Claude Desktop apps
- An open-source repository of MCP servers

Claude 3.5 Sonnet is adept at quickly building MCP server implementations, making it easy for organizations and individuals to rapidly connect their most important datasets with a range of AI-powered tools. To help developers start exploring, we’re sharing pre-built MCP servers for popular enterprise systems like Google Drive, Slack, GitHub, Git, Postgres, and Puppeteer.
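
On the client side, a host application launches one of these servers and queries it over the protocol before handing its tools and resources to a model. The sketch below uses the Python SDK's stdio client against the pre-built Postgres server; the npx command, package name, and connection string are assumptions for illustration and may not match your environment.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch a pre-built server as a subprocess speaking MCP over stdio.
# The command, package name, and connection string are illustrative only.
server = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"],
)


async def main() -> None:
    async with stdio_client(server) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            # Handshake: exchange protocol versions and capabilities.
            await session.initialize()

            # Discover what the server offers before exposing it to a model.
            tools = await session.list_tools()
            resources = await session.list_resources()
            print("tools:", [tool.name for tool in tools.tools])
            print("resources:", [str(resource.uri) for resource in resources.resources])


asyncio.run(main())
```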

Early adopters like Block and Apollo have integrated MCP into their systems, while development tools companies including Zed, Replit, Codeium, and Sourcegraph are working with MCP to enhance their platforms—enabling AI agents to better retrieve relevant information to further understand the context around a coding task and produce more nuanced and functional code with fewer attempts.

"At Block, open source is more than a development model—it’s the foundation of our work and a commitment to creating technology that drives meaningful change and serves as a public good for all,” said Dhanji R. Prasanna, Chief Technology Officer at Block. “Open technologies like the Model Context Protocol are the bridges that connect AI to real-world applications, ensuring innovation is accessible, transparent, and rooted in collaboration. We are excited to partner on a protocol and use it to build agentic systems, which remove the burden of the mechanical so people can focus on the creative.”

Instead of maintaining separate connectors for each data source, developers can now build against a standard protocol. As the ecosystem matures, AI systems will maintain context as they move between different tools and datasets, replacing today's fragmented integrations with a more sustainable architecture.
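
In practice that means switching data sources becomes a configuration change rather than a new connector: only the launch parameters differ, and the session code from the sketch above is reused unchanged. The package name and environment variable below are again assumptions for illustration.

```python
from mcp import StdioServerParameters

# Point the same client at a different data source; the initialize /
# list_tools / list_resources logic stays exactly the same.
github_server = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-github"],
    env={"GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>"},
)
```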

Getting started

Developers can start building and testing MCP connectors today. Existing Claude for Work customers can begin testing MCP servers locally, connecting Claude to internal systems and datasets. We'll soon provide developer toolkits for deploying remote production MCP servers that can serve your entire Claude for Work organization.
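
For local testing, MCP servers are registered in the Claude Desktop app's configuration file (claude_desktop_config.json, stored in a platform-specific location such as ~/Library/Application Support/Claude on macOS). The entries below are a sketch of the general shape; the server names, commands, and paths are assumptions you would replace with your own.

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    },
    "notes": {
      "command": "python",
      "args": ["/path/to/notes_server.py"]
    }
  }
}
```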

To start building:
- Install pre-built MCP servers through the Claude Desktop app
- Follow our quickstart guide to build your first MCP server
- Contribute to our open-source repositories of connectors and implementations

An open community

We’re committed to building MCP as a collaborative, open-source project and ecosystem, and we’re eager to hear your feedback. Whether you’re an AI tool developer, an enterprise looking to leverage existing data, or an early adopter exploring the frontier, we invite you to build the future of context-aware AI together.