web3 Research

Report

Next Bubble? The Future of the web3 AI Agent Market with MCPs

Distributed MCP, ACP for Virtual Protocol, MCP×Base and Solana case studies.

mitsui
Mar 27, 2025 ∙ Paid

Good morning.
This is Mitsui from web3 Research.

Today's article, titled "The Future of the web3 AI Market with MCP," is about new AI agent market trends that are just beginning to emerge.

Since this is a relatively new technology in the AI market, some of the discussion will be fairly difficult (however, I will not go into deep technical detail; I will only explain the concepts).

Also, the description may be a bit abstract, as there are still very few use cases in this new web3 area, but I am personally excited that it could become a key part of the next AI agent bubble in web3. Please read to the end!

📦What is MCP?
✨What happens when MCPs are integrated in web3?
💬The center of the next bubble?


📦What is MCP?

Let's start with an explanation of "MCP". This is not a crypto term.

MCP stands for "Model Context Protocol," an open standard protocol for connecting large language models (LLMs) with external data sources and tools, proposed by Anthropic, the developer of Claude, in November 2024.

https://www.anthropic.com/news/model-context-protocol

Simply put, it is a mechanism that links various AI models with diverse tools and data through one common interface (connection standard).

The challenge behind the creation of MCP is the disconnect between AI and data. Even advanced LLMs cannot directly access the latest information or internal data outside their training data, and custom implementations have been required each time they are integrated with various databases and services.

If you are a regular AI user, I think you will agree: AI is already very advanced and useful, but "access to company-specific data" and "accurate real-time data" are still weak points.

Web search is now possible, but it is not accurate enough for real-time data because it pulls answers from a diverse set of sources.

If today's AI could also integrate with in-house databases and connect to external APIs to collect and output real-time data, it would be very useful.

Integrating internal data and accessing data through external APIs is already possible, but because the standards differ from service to service, custom development has been required for each individual case. In addition, external data can only be collected if the published APIs happen to be suitable for AI.

To solve this integration problem (combining different models with numerous tools), Anthropic released MCP as open source: a common protocol that anyone can use and implement. This lets LLM vendors and tool developers build against a common specification, increasing the interoperability of AI systems and enabling scalable, sustainable integration.
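
To make "building against a common specification" a little more concrete, here is a minimal sketch of an MCP server written with the MCP Python SDK's FastMCP helper. The server name, the tool, and its stubbed logic are hypothetical examples; only the overall pattern (declare a server, register tools, run it over a standard transport) follows the SDK's published quickstart.

```python
# Minimal MCP server sketch. Assumes the official MCP Python SDK is installed
# ("pip install mcp"); the server name, tool, and stub data are hypothetical.
from mcp.server.fastmcp import FastMCP

# One server can expose many tools and resources behind the same standard protocol.
mcp = FastMCP("example-data-server")

@mcp.tool()
def get_internal_sales(year: int) -> str:
    """Return a summary of internal sales data for the given year (stubbed)."""
    # A real deployment would query an internal database or API here.
    return f"Sales summary for {year}: (stub data)"

if __name__ == "__main__":
    # Serves MCP over stdio, so any MCP-capable client can discover
    # and call get_internal_sales without custom integration code.
    mcp.run()
```

An MCP-capable client could then discover and call `get_internal_sales` through the protocol itself, with no bespoke glue code on either side.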

Described as a "USB-C port for AI applications," MCP aims to replace integrations that have traditionally been built separately for each data source with a single standard protocol.

This protocol allows the AI assistant (LLM) to securely and interactively access the external information it needs to generate more accurate and relevant answers.

https://modelcontextprotocol.io/introduction
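
Under the hood, the client and server exchange JSON-RPC 2.0 messages. As a rough, illustrative sketch, the discovery and tool-call requests look roughly like the dictionaries below; the method names follow the public MCP specification, while the tool name and arguments are hypothetical.

```python
# Illustrative shape of MCP's JSON-RPC 2.0 traffic. The client first asks what
# the server offers, then invokes a tool by name with structured arguments.
list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_internal_sales",   # hypothetical tool from the earlier sketch
        "arguments": {"year": 2024},
    },
}
```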

If you also grant access privileges to the output destination in advance, the AI will autonomously process the collected data and even perform actions such as sending a Slack notification or writing the results into a Google Document.

These workflows can be established smoothly by unifying them under the common protocol called MCP.
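
As a sketch of such an "action" step, the tool below continues the hypothetical server from the earlier example: once the model has processed the data it collected, it can call this tool to push the result to Slack. The webhook environment variable and the Slack setup are assumptions for illustration, not part of MCP itself.

```python
# Continues the hypothetical server sketch above: an "action" tool the model can
# call after processing the data it collected. The Slack webhook setup is assumed;
# "requests" is a third-party HTTP client (pip install requests).
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-data-server")  # same server as in the earlier sketch

@mcp.tool()
def notify_slack(message: str) -> str:
    """Post a message to a Slack channel via an incoming webhook granted in advance."""
    webhook_url = os.environ["SLACK_WEBHOOK_URL"]  # access configured ahead of time
    response = requests.post(webhook_url, json={"text": message}, timeout=10)
    response.raise_for_status()
    return "notification sent"
```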

For example, typical use cases for MCP include the following:

  • Corporate Data Retrieval Assistant: If an MCP server is set up on an internal document management system or knowledge base, an LLM can retrieve the relevant material, summarize it, and answer questions as needed. For example, if you ask for a summary of last year's sales report, the AI will open the file on Google Drive via MCP, read the content, and summarize it (see the sketch after this list).

  • AI Coding Assistant: With an MCP server connected to the development environment (IDE) and code repository, you can build an agent that views and edits code. For example, when reviewing a pull request, the AI can automatically retrieve the change diffs (resources) and, if necessary, invoke a test execution tool to point out problems or suggest fixes.

  • Business Automation / RPA: Here, the AI calls various service APIs to handle repetitive tasks such as scheduling and data entry, registering appointments and sending notifications according to the user's instructions. For example, if you ask the AI to "adjust the meeting schedule for next week," it will search for available time on the company calendar and automatically suggest a date and time.
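
As referenced in the first item above, here is a minimal sketch of the corporate data retrieval pattern: an MCP resource that exposes a (hypothetical) sales report so the LLM can read and summarize it. The URI scheme, file path, and server name are invented for illustration; in practice this would sit in front of Google Drive or an internal knowledge base.

```python
# Sketch of the corporate data retrieval use case: expose an internal report as an
# MCP resource so the LLM can read and summarize it. URI scheme, paths, and server
# name are hypothetical; the decorator pattern follows the MCP Python SDK (FastMCP).
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("company-docs")

@mcp.resource("docs://reports/{year}")
def read_sales_report(year: str) -> str:
    """Return the text of the sales report for the given year."""
    # Stand-in for a Google Drive or knowledge-base lookup: read a local file.
    return Path(f"./reports/sales_{year}.txt").read_text(encoding="utf-8")

if __name__ == "__main__":
    mcp.run()
```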

This may be a little difficult to understand, but I hope you got the idea.

For further technical details, please refer to the Anthropic pages linked above.


✨What happens when MCP is integrated in web3?

So what will happen when this MCP is integrated into web3?

Keep reading with a 7-day free trial
