Go Micro
March 4, 2026 • By the Go Micro Team
Go Micro was recently given access to Claude Max through Anthropic’s open source sponsorship program. We wanted to share what we’ve built with it, why it matters, and where we’re headed.
Anthropic offers Claude Max access to open source projects. Go Micro applied because our MCP integration — making every microservice an AI tool — aligns directly with Anthropic’s Model Context Protocol. They agreed, and we got to work.
The result: three major features shipped in a single sprint, taking our Q2 2026 roadmap from 85% to 95% complete.
The MCP gateway previously supported HTTP/SSE and stdio transports. These work well for request/response patterns, but real-time AI agents need persistent, bidirectional connections.
We added a full WebSocket transport implementing JSON-RPC 2.0:
// Connect via WebSocket for bidirectional streaming
ws://localhost:3000/mcp/ws
The WebSocket transport speaks the same JSON-RPC 2.0 protocol as stdio (initialize, tools/list, tools/call), so any MCP client that speaks WebSocket can connect, and the connection stays open for bidirectional streaming.
// Agent connects and discovers tools (Node.js "ws" client, which
// accepts custom headers; the browser WebSocket API does not)
const ws = new WebSocket("ws://localhost:3000/mcp/ws", {
  headers: { "Authorization": "Bearer my-token" }
});

// Initialize the session
ws.send(JSON.stringify({
  jsonrpc: "2.0", id: 1,
  method: "initialize",
  params: { protocolVersion: "2024-11-05" }
}));

// List available tools
ws.send(JSON.stringify({
  jsonrpc: "2.0", id: 2,
  method: "tools/list"
}));

// Call a tool
ws.send(JSON.stringify({
  jsonrpc: "2.0", id: 3,
  method: "tools/call",
  params: {
    name: "users.Users.Get",
    arguments: { "id": "user-123" }
  }
}));
This is particularly useful for the agent playground in micro run, where the browser maintains a persistent WebSocket connection for interactive AI conversations.
Production deployments need observability. We added full OpenTelemetry span instrumentation across all three MCP transports (HTTP, stdio, WebSocket).
import "go.opentelemetry.io/otel/sdk/trace"

// Add tracing to your MCP gateway
mcp.Serve(mcp.Options{
    Registry:      service.Options().Registry,
    Address:       ":3000",
    TraceProvider: traceProvider, // your OTel trace provider
})
Every tool call now creates a span with rich attributes:
Span: mcp.tool.call
  mcp.tool.name:          users.Users.Get
  mcp.transport:          websocket
  mcp.account.id:         agent-001
  mcp.auth.status:        allowed
  mcp.rate_limit.allowed: true
This connects to your existing observability stack — Jaeger, Grafana, Datadog, whatever you use. You can now trace an AI agent’s tool calls through your entire service mesh.
The integration is backward compatible: if you don’t set a TraceProvider, spans are no-ops with zero overhead.
With the LangChain SDK already shipped, we built the LlamaIndex integration — enabling RAG (Retrieval-Augmented Generation) workflows with Go Micro services.
from go_micro_llamaindex import GoMicroToolkit
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI
# Connect to your services
toolkit = GoMicroToolkit.from_gateway("http://localhost:3000")
# Create a ReAct agent with your service tools
agent = ReActAgent.from_tools(
    toolkit.get_tools(),
    llm=OpenAI(model="gpt-4"),
    verbose=True
)
# The agent can now call your microservices
response = agent.chat("Get the profile for user-123")
The LlamaIndex SDK supports the same filtering as LangChain:
# Filter by service
user_tools = toolkit.get_tools(service_filter="users")
# Filter by pattern
blog_tools = toolkit.get_tools(name_pattern="blog.*")
# Combine with RAG
from llama_index.core import VectorStoreIndex
from llama_index.core.tools import QueryEngineTool
index = VectorStoreIndex.from_documents(documents)
rag_tool = QueryEngineTool(query_engine=index.as_query_engine(), ...)
# Agent has both document search AND service access
all_tools = [rag_tool] + toolkit.get_tools()
agent = ReActAgent.from_tools(all_tools, llm=llm)
This is powerful: an agent can search your documentation AND call your services in the same conversation.
Here’s where Go Micro’s MCP integration stands today:
| Metric | Value |
|---|---|
| MCP Gateway Code | 2,500+ lines |
| Test Coverage | 1,000+ lines, 35+ tests |
| Transports | 3 (HTTP/SSE, Stdio, WebSocket) |
| Agent SDKs | 2 (LangChain, LlamaIndex) |
| Model Providers | 2 (Anthropic Claude, OpenAI GPT) |
| Security | Auth, scopes, rate limiting, audit, OTel |
The Q1 2026 foundation is complete, Q2 is at 95%, and we’ve already delivered 50% of Q3’s production features ahead of schedule.
If you’re building microservices with Go Micro, your services are already AI-ready. Here’s what you can do today:
// Expose your services as MCP tools
mcp.Serve(mcp.Options{
    Registry: service.Options().Registry,
    Address:  ":3000",
})
{
  "mcpServers": {
    "my-services": {
      "command": "micro",
      "args": ["mcp", "serve"]
    }
  }
}
# Use the same tools from Python agents
toolkit = GoMicroToolkit.from_gateway("http://localhost:3000")
tools = toolkit.get_tools()
// Enable tracing and auditing for production
mcp.Serve(mcp.Options{
    Registry:      registry,
    TraceProvider: otelProvider,
    AuditFunc:     func(r mcp.AuditRecord) { /* log it */ },
})
A note on the development process itself: we used Claude (via Claude Code) to implement these features. It wrote production Go code, ran the tests, fixed compilation errors, and iterated on the implementation. The WebSocket transport went from zero to 14 passing tests in a single session. The OpenTelemetry integration was designed, implemented, and tested in another.
This is exactly the kind of workflow that MCP enables. An AI agent that understands your codebase, calls your tools, and ships features. Go Micro is both the framework for building this and a beneficiary of it.
With Q2 nearly wrapped, we’re focused on:

- Refining the /agent chat UI in micro run for demos and daily development
- Shipping micro-mcp-gateway as a production-grade, independently deployable binary

The MCP ecosystem is growing fast. We think every microservices framework will have MCP support eventually — Go Micro just got there first.
# Install or update
go install go-micro.dev/v5/cmd/micro@latest
# Create a service
micro new myservice
cd myservice
# Run with MCP and the agent playground
micro run --mcp-address :3000
# Open http://localhost:8080/agent and chat with your service
See the MCP documentation and AI-native services guide for the full walkthrough.
Go Micro is an open source framework for distributed systems development. Star us on GitHub — we’re at 21K stars and growing.
Thanks to Anthropic for the Claude Max sponsorship through their open source program.