Your Own MCP Registry, On Your Own Terms (Docker-Friendly, Open Source).

Build a self-hosted MCP Registry that GitHub Copilot can actually talk to—v0.1 endpoints, CORS, versioning, GitOps workflows, and the production guardrails that keep “tool chaos” from becoming your hobby.
Table of contents
- Why we need a registry (and why “just a spreadsheet” stops working)
- The real compatibility target: what GitHub Copilot expects
- Option A: Self-host an upstream registry
- Option B: Build a Minimal Viable Registry (MVR) in a weekend
- A practical data model: servers + versions + governance
- Docker-friendly FastAPI registry that matches Copilot’s contract
- Production hardening checklist
- Comparisons: upstream vs MVR vs “API catalog registry”
- Use cases that resonate with devs and AI VPs
- Key takeaways
Why we need a registry (and why “just a spreadsheet” stops working)
Let’s be honest: the first “registry” every team builds is a Notion page called “MCP servers we should probably remember.”
And it works… until it doesn’t.
Because the moment MCP servers stop being cute experiments and become real infra—CRM tools, internal search, incident bots, policy checkers—we need:
- A single source of truth (what tools exist)
- Governance (who approved what, where it’s allowed, what’s deprecated)
- Discoverability (for Copilot, IDEs, internal agent platforms)
- Repeatability (same tools across dev/stage/prod)
That’s what a registry gives us: a tool catalog with contracts—not just a list.
The real compatibility target: what GitHub Copilot expects
If the goal is “Copilot should discover our MCP servers,” don’t invent your own shape and pray.
GitHub’s guidance is explicit: a valid registry should follow the v0.1 MCP registry spec and expose these endpoints:
```
GET /v0.1/servers
GET /v0.1/servers/{serverName}/versions/latest
GET /v0.1/servers/{serverName}/versions/{version}
```
Important: GitHub notes the older v0 spec is unstable and should not be implemented. Build against v0.1.
CORS requirement (don’t skip this)
To let Copilot fetch the registry from a browser context, GitHub requires CORS headers on the /v0.1/servers endpoints, including:
```
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET, OPTIONS
Access-Control-Allow-Headers: Authorization, Content-Type
```
Option A: Self-host an upstream registry
If you want the fastest path to “production-grade enough,” using an upstream open-source registry can be a win—especially if you want:
- a real persistence layer
- publishing workflows
- validation and schema evolution handled upstream
But here’s the nuance: “self-host” should still mean you control governance:
- approval workflows
- environment scoping
- safety policies (PII, secrets, data access)
- deprecation rules
Treat upstream as the engine—you provide the guardrails.
Option B: Build a Minimal Viable Registry (MVR) in a weekend (GitOps-first)
Sometimes you don’t need a full “tool marketplace.”
You need a Docker-friendly HTTP service that answers Copilot registry queries and is backed by YAML in Git.
This is the “small, sharp knife” approach:
- easy to deploy
- easy to audit
- PRs become the approval flow
- no database required
Non-negotiables:
- match v0.1 endpoints
- ship the required CORS behavior
A practical registry data model: servers + versions + governance
Copilot asks for server versions, so your model must include:
- `servers[]`
- each server has `versions[]`
- governance metadata belongs at the server level and (optionally) per version
Create `registry.yaml`:

```yaml
servers:
  - id: "cohorte-search"
    name: "Cohorte Search"
    description: "Internal semantic search MCP server"
    tags: ["search", "knowledge"]
    status: "active"              # active | deprecated | disabled
    envs: ["prod"]
    ownership:
      team: "platform"
      slack: "#ai-platform"
    governance:
      approved_by: "security@cohorte.co"
      approved_at: "2025-12-01"
      risk: "medium"
      data_access: "restricted"   # none | restricted | pii
      prod_allowed: true
    versions:
      - version: "1.2.0"
        transport:
          type: "sse"
          url: "https://mcp-search.cohorte.co/sse"
        released_at: "2025-12-10"
      - version: "1.1.0"
        transport:
          type: "sse"
          url: "https://mcp-search.cohorte.co/sse"
        released_at: "2025-11-01"
```

Key principles (keep your future self employed)
- Registry entries should describe where and what, not how to authenticate.
- No secrets in the registry payload (ever).
- Treat the registry as an allowlist, not a suggestion box.
A Docker-friendly FastAPI registry that matches Copilot’s contract
This implementation includes:
- Required v0.1 endpoints (`/v0.1/...`)
- Required CORS behavior (including `Access-Control-Allow-Origin: *`)
- Safer Pydantic defaults (`default_factory`)
- SemVer-aware “latest” selection (avoids lexicographic traps)
app.py
```python
from fastapi import FastAPI, HTTPException, Query
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel, Field
from typing import List, Optional, Dict, Any
from packaging.version import Version, InvalidVersion
import yaml
import os

REGISTRY_PATH = os.environ.get("MCP_REGISTRY_FILE", "registry.yaml")

app = FastAPI(title="Self-hosted MCP Registry", version="0.1.0")

# GitHub requires permissive CORS for registry fetches.
# Note: allow_origins=["*"] means any website can read these responses in a browser context.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=False,
    allow_methods=["GET", "OPTIONS"],
    allow_headers=["Authorization", "Content-Type"],
)


class ServerVersion(BaseModel):
    version: str
    transport: Dict[str, Any] = Field(default_factory=dict)
    released_at: Optional[str] = None


class ServerEntry(BaseModel):
    id: str
    name: str
    description: Optional[str] = None
    tags: List[str] = Field(default_factory=list)
    status: str = "active"
    envs: List[str] = Field(default_factory=list)
    ownership: Dict[str, Any] = Field(default_factory=dict)
    governance: Dict[str, Any] = Field(default_factory=dict)
    versions: List[ServerVersion] = Field(default_factory=list)


def load_registry() -> List[ServerEntry]:
    try:
        with open(REGISTRY_PATH, "r", encoding="utf-8") as f:
            raw = yaml.safe_load(f) or {}
        servers = raw.get("servers", [])
        return [ServerEntry(**s) for s in servers]
    except FileNotFoundError:
        return []
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Failed to load registry file: {e}")


def get_server(server_id: str) -> ServerEntry:
    servers = load_registry()
    s = next((x for x in servers if x.id == server_id), None)
    if not s:
        raise HTTPException(status_code=404, detail="Server not found")
    return s


def semver_latest(versions: List[ServerVersion]) -> ServerVersion:
    # Tuple keys keep valid SemVer above non-SemVer strings and never compare
    # Version to str directly (which would raise TypeError during sorting).
    def key(v: ServerVersion):
        try:
            return (1, Version(v.version))
        except InvalidVersion:
            # Non-semver falls back to raw string ordering, below any valid SemVer.
            return (0, v.version)
    return sorted(versions, key=key)[-1]


@app.get("/v0.1/servers")
def list_servers(
    env: Optional[str] = Query(default=None),
    tag: Optional[str] = Query(default=None),
    status: Optional[str] = Query(default="active"),
):
    servers = load_registry()

    def keep(s: ServerEntry) -> bool:
        if status and s.status != status:
            return False
        if env and env not in s.envs:
            return False
        if tag and tag not in s.tags:
            return False
        return True

    filtered = [s.model_dump() for s in servers if keep(s)]
    return {"servers": filtered, "count": len(filtered)}


@app.get("/v0.1/servers/{server_id}/versions/latest")
def latest_version(server_id: str):
    s = get_server(server_id)
    if not s.versions:
        raise HTTPException(status_code=404, detail="No versions found")
    return semver_latest(s.versions).model_dump()


@app.get("/v0.1/servers/{server_id}/versions/{version}")
def get_version(server_id: str, version: str):
    s = get_server(server_id)
    v = next((vv for vv in s.versions if vv.version == version), None)
    if not v:
        raise HTTPException(status_code=404, detail="Version not found")
    return v.model_dump()


# Optional (nice-to-have): list all versions for humans/tools.
@app.get("/v0.1/servers/{server_id}/versions")
def list_versions(server_id: str):
    s = get_server(server_id)
    return {"versions": [v.model_dump() for v in s.versions], "count": len(s.versions)}
```

Dockerfile
```dockerfile
FROM python:3.12-slim
WORKDIR /app

# Install dependencies first so this layer caches across code-only changes.
RUN pip install --no-cache-dir \
    "fastapi>=0.110" \
    "uvicorn[standard]>=0.27" \
    "pydantic>=2.0" \
    "pyyaml>=6.0" \
    "packaging>=23.0"

COPY app.py registry.yaml /app/

EXPOSE 8080
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8080"]
```

docker-compose.yml
```yaml
services:
  mcp-registry:
    build: .
    ports:
      - "8080:8080"
    environment:
      MCP_REGISTRY_FILE: /app/registry.yaml
```
Run + test
```bash
docker compose up --build

curl "http://localhost:8080/v0.1/servers"
curl "http://localhost:8080/v0.1/servers/cohorte-search/versions/latest"
curl "http://localhost:8080/v0.1/servers/cohorte-search/versions/1.2.0"
```
What “great” looks like: production hardening checklist
1) Governance: treat the registry like an allowlist
Add fields like:
- `status: active | deprecated | disabled`
- `approved_by`, `approved_at`
- `risk`, `data_access`, `prod_allowed`
Then enforce policy server-side:
- prod clients only see `active` + `prod_allowed`
- deprecated servers stay visible but clearly marked (with sunset dates)
- disabled servers are hidden or explicitly blocked
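Those three rules fit in one small function. Here's a minimal sketch of that server-side enforcement, reusing the field names from the data model above; `visible_servers` and the `client_env` parameter are illustrative names, not part of any spec.

```python
from typing import Any, Dict, List


def visible_servers(servers: List[Dict[str, Any]], client_env: str) -> List[Dict[str, Any]]:
    """Apply registry policy server-side: callers only ever see what policy allows."""
    out = []
    for s in servers:
        status = s.get("status", "active")
        if status == "disabled":
            continue  # disabled servers are never returned
        if client_env not in s.get("envs", []):
            continue  # environment scoping is mandatory, not optional
        if client_env == "prod" and not s.get("governance", {}).get("prod_allowed", False):
            continue  # prod visibility requires an explicit approval flag
        if status == "deprecated":
            s = {**s, "deprecated": True}  # stays visible, but clearly marked
        out.append(s)
    return out
```

Because the filter runs in the registry (not the client), a misconfigured IDE can't accidentally see an unapproved tool.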
VP: “Can we guarantee Copilot only uses approved tools?”
Us: “Yes—if the registry enforces it, not just a wiki page.”
2) Environment scoping (the #1 “oops” moment)
Make env filtering first-class and default to safe behavior.
3) Authentication: never leak secrets into the registry
Registry entries should never contain:
- API keys
- OAuth refresh tokens
- internal headers
Instead:
- store secrets in Vault / AWS Secrets Manager / Doppler
- reference an auth profile (a pointer, not a secret), e.g. `authProfile: "jira-prod-oauth"`
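To make the pointer concrete, here's a hypothetical resolver: `resolve_auth_profile` and the `MCP_AUTH_*` environment convention are invented for illustration. In practice this lookup would hit Vault or your secrets manager at the gateway or client side, never inside the registry itself.

```python
import os


def resolve_auth_profile(profile: str) -> str:
    """Resolve an auth profile name to a credential at request time.

    The registry only ever stores the profile name (e.g. "jira-prod-oauth");
    the secret itself lives in the environment / secrets manager.
    """
    env_key = "MCP_AUTH_" + profile.upper().replace("-", "_")
    secret = os.environ.get(env_key)
    if secret is None:
        raise KeyError(f"No credential configured for auth profile '{profile}'")
    return secret
```

The registry payload stays safe to cache, log, and serve with `Access-Control-Allow-Origin: *`, because there's nothing sensitive in it to leak.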
4) Supply-chain guardrails: prevent registry poisoning
Add:
- CODEOWNERS approvals for registry changes
- schema validation in CI
- domain allowlists (`*.cohorte.co`, known SaaS domains)
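A CI check can cover required fields and the domain allowlist in one pass. This is a minimal sketch assuming the `registry.yaml` shape shown earlier; the allowlist patterns and the `check_entry` helper are examples, not a standard.

```python
from fnmatch import fnmatch
from typing import Any, Dict, List
from urllib.parse import urlparse

# Example patterns only -- replace with your org's real allowlist.
ALLOWED_DOMAINS = ["*.cohorte.co", "mcp.example-saas.com"]


def check_entry(server: Dict[str, Any]) -> List[str]:
    """Return a list of policy violations for one registry entry (empty = OK)."""
    errors = []
    for field in ("id", "name", "versions"):
        if not server.get(field):
            errors.append(f"{server.get('id', '?')}: missing required field '{field}'")
    for v in server.get("versions", []):
        url = v.get("transport", {}).get("url", "")
        host = urlparse(url).hostname or ""
        if not any(fnmatch(host, pat) for pat in ALLOWED_DOMAINS):
            errors.append(f"{server.get('id', '?')}: host '{host}' not in domain allowlist")
    return errors
```

Run it over every entry in the PR; a non-empty result fails the build, so a poisoned URL never merges.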
5) Observability (yes, even for the registry)
Minimum:
- access logs
- request IDs
- rate limiting
- basic metrics by endpoint
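Rate limiting for a read-mostly service doesn't need heavy machinery. A per-client token bucket like this sketch is often enough in front of the endpoint handlers; the numbers you pick for `rate` and `capacity` are your call.

```python
import time


class TokenBucket:
    """Simple token-bucket rate limiter: refills `rate` tokens/second, bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Keep one bucket per client identity (API key, source IP) in a dict, and return 429 when `allow()` says no.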
6) Caching that respects reality
Registries are read-heavy:
- `ETag` / `Cache-Control`
- in-memory cache refreshed on an interval
- optional CDN in front
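An `ETag` can be as simple as a hash of the registry file bytes. This sketch shows the two halves of the conditional-GET decision as plain functions; wiring them into the FastAPI handlers (read `If-None-Match`, return 304 with no body) is the obvious next step, and the helper names are ours, not FastAPI's.

```python
import hashlib
from typing import Optional


def compute_etag(payload: bytes) -> str:
    """Derive a strong ETag from the raw registry content."""
    return '"' + hashlib.sha256(payload).hexdigest()[:16] + '"'


def not_modified(if_none_match: Optional[str], etag: str) -> bool:
    """True when the client's cached copy is still current (serve 304, no body)."""
    return if_none_match is not None and if_none_match == etag
```

Because the ETag changes exactly when the YAML changes, a CDN or Copilot client revalidates cheaply instead of re-downloading the catalog on every poll.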
Comparisons: which approach fits your org?
Self-host an upstream registry (best for platform teams)
Pros
- richer workflows and schema handling
- better long-term alignment with ecosystem evolution
Cons
- more moving parts
- governance is still on you
Minimal file-backed registry (best for “move fast, prove value”)
Pros
- ship in days
- GitOps native (PR = change control)
- easy to audit
Cons
- you own compatibility with the spec and client expectations
- you’ll reinvent workflows as adoption grows
“Registry via API catalog” (best for governance-heavy orgs)
If you already have an internal API catalog, you can back the registry with it—just ensure it still serves the required v0.1 endpoints and CORS behavior.
Use cases that resonate with devs and AI VPs
- “Approved tools only” for enterprise Copilot
- One tool catalog across IDE + internal agents
- Deprecation without chaos (sunset dates + migration paths)
Key takeaways (the stuff we’d pin in Slack)
- A self-hosted MCP registry isn’t about listing tools—it’s governance + discoverability + consistency.
- For Copilot compatibility, implement v0.1 endpoints (not v0).
- Don’t forget the required CORS headers (including `Access-Control-Allow-Origin: *`).
- Keep secrets out of the registry—treat it like a catalog, not a credential store.
- Invest early in env scoping, approvals, and deprecation—future you will absolutely send present you a thank-you emoji.
If MCP is how tools enter the agentic era, the registry is how we stop that era from turning into a “who deployed what where?” mystery novel.
Our first registry can be small. But it should be intentional, compatible, and safe.
— Cohorte Team
December 22, 2025.