πŸ”’ Open Source Β· MIT License

Your data stays yours.

LOG-mcp is a privacy middleware for AI. Strip PII before it reaches any AI provider. Rehydrate the response. Zero trust, zero leaks.

πŸ›‘οΈ PII Stripping

Regex-based detection for emails, phones, SSNs, credit cards, API keys, addresses, and IPs β€” replaced with typed placeholders.

πŸ”„ Transparent Proxy

Drop-in replacement for OpenAI's API. Same format, same streaming. Your app doesn't know the difference.

☁️ Edge-First

Runs on Cloudflare Workers with KV storage. Sub-10ms overhead. No cold starts. Global by default.
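The stripping step described above can be sketched in a few lines of Python. This is an illustrative sketch, not LOG-mcp's actual implementation: the regex patterns and the `[TYPE_N]` placeholder format are assumptions.

```python
import re

# Illustrative regex-based PII stripping with typed placeholders.
# Patterns and placeholder format are assumptions, not LOG-mcp's rules.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def strip_pii(text: str) -> tuple[str, dict[str, str]]:
    """Replace matches with typed placeholders; return clean text + vault mapping."""
    vault: dict[str, str] = {}
    for kind, pattern in PATTERNS.items():
        def repl(m, kind=kind):
            key = f"[{kind}_{len(vault)}]"
            vault[key] = m.group(0)  # remember the original for rehydration
            return key
        text = pattern.sub(repl, text)
    return text, vault

clean, vault = strip_pii("Reach me at jane@example.com or 555-867-5309")
# clean no longer contains the raw email or phone number;
# vault maps each placeholder back to the original value
```

The vault mapping is what the real service would persist (in KV, per the Edge-First card) so the response can be rehydrated later.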

Try It

Paste text with PII and watch it get stripped β€” entirely client-side.


Architecture

Client (Your App) → LOG-mcp (πŸ›‘οΈ Vault) → Provider (OpenAI / etc) → LOG-mcp (πŸ’§ Rehydrate) → Client (Clean Response)
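The rehydrate leg of the flow above is the inverse lookup: placeholders in the provider's response are swapped back for the originals held in the vault. A minimal sketch, assuming the `[TYPE_N]` placeholder format:

```python
# Minimal rehydration sketch: swap typed placeholders in the provider's
# response back for the originals held in the vault mapping.
# The placeholder format is an assumption for illustration.
def rehydrate(text: str, vault: dict[str, str]) -> str:
    for placeholder, original in vault.items():
        text = text.replace(placeholder, original)
    return text

vault = {"[EMAIL_0]": "jane@example.com"}
response = "Sure, I'll send the summary to [EMAIL_0]."
print(rehydrate(response, vault))
# → Sure, I'll send the summary to jane@example.com.
```

The provider only ever sees the placeholder, so the raw value never leaves the vault.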

Get Started

Deploy the Worker to Cloudflare. PII mappings live in Workers KV, so there is no extra infrastructure to manage.

git clone https://github.com/lucineer/LOG-mcp.git
cd LOG-mcp/cloudflare/worker
npm install

# Create KV namespace for PII mappings
wrangler kv namespace create PII_MAP

# Set your AI provider API key
wrangler secret put API_KEY

# Deploy
wrangler deploy
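Once deployed, an existing OpenAI client only needs its base URL changed. The sketch below shows the shape of such a request; the worker URL is a placeholder for your own deployment, and the `/v1/chat/completions` path is assumed from the drop-in-replacement claim.

```python
import json
import urllib.request

# Placeholder URL: substitute your deployed worker's address.
WORKER_URL = "https://log-mcp.example.workers.dev/v1/chat/completions"

# Standard OpenAI chat-completions payload: unchanged by the proxy.
payload = {
    "model": "gpt-4",
    "messages": [
        {"role": "user", "content": "Email jane@example.com a summary"}
    ],
}

req = urllib.request.Request(
    WORKER_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would send it; the worker strips the email
# before forwarding to the provider and rehydrates it in the response.
```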

Run the full Python backend locally, with the Cloudflare Worker acting as the edge proxy.

# Deploy worker first (see Cloudflare tab)
cd LOG-mcp
pip install -e .

# Configure local vault
log-mcp init --provider openai --model gpt-4

# Start MCP server
log-mcp serve --port 8000

# Update worker PROVIDER_ENDPOINT to point to your local server

Full stack with Docker Compose β€” vault, optional Ollama for local LLM, and Cloudflare Tunnel.

git clone https://github.com/lucineer/LOG-mcp.git
cd LOG-mcp
cp .env.example .env
# Edit .env with your API keys

docker compose -f cloudflare/docker/docker-compose.yml up -d

# Vault: http://localhost:8000
# Health: http://localhost:8080