About MCP

What is MCP

MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
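To make the idea concrete, here is a minimal sketch of an MCP server built with the MCP TypeScript SDK (@modelcontextprotocol/sdk). The "add" tool and the server name are illustrative only, and exact import paths and signatures may differ between SDK versions:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Declare the server with a name and version that clients will see.
const server = new McpServer({ name: "hello-mcp", version: "1.0.0" });

// Expose a single tool; an MCP-capable LLM client can discover and call it.
server.tool(
  "add",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);

// Serve over stdio so any MCP client can launch this process and talk to it.
const transport = new StdioServerTransport();
await server.connect(transport);
```

Any MCP-compatible AI client (Claude Desktop, Cursor, and others) can plug into a server like this the same way it plugs into any other MCP server, which is exactly the "USB-C port" idea above.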

Why MCP?

MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides:

  • A growing list of pre-built integrations that your LLM can directly plug into

  • The flexibility to switch between LLM providers and vendors

  • Best practices for securing your data within your infrastructure

For more information about MCP, please visit https://modelcontextprotocol.io/introduction

Why DePHY for MCP?

In short:

  • Hosting MCP servers requires professional programming skills

  • Running every MCP server locally consumes significant compute resources

  • Hosting unknown MCP servers introduces security risks

  • Some services (e.g., scraping and searching) require large pools of residential IPs

DePHY Service Mesh hosts commonly used MCP servers behind a single unified endpoint, adding extra capabilities for your AI clients.
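As a rough sketch of what "one unified endpoint" means in practice, the snippet below connects a standard MCP client to a remote endpoint over SSE. The URL is a placeholder, and whether the Service Mesh exposes SSE or another MCP transport is an assumption here; see "How to use DePHY MCP" for the actual connection details:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Hypothetical endpoint -- replace with the real URL from "How to use DePHY MCP".
const MESH_URL = new URL("https://mcp.example.dephy.io/sse");

const client = new Client(
  { name: "my-ai-client", version: "1.0.0" },
  { capabilities: {} }
);

// A single connection exposes the tools of the MCP servers hosted behind the mesh.
await client.connect(new SSEClientTransport(MESH_URL));
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```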
