Transform any command, script, or API into a standardized Model Context Protocol (MCP) server that LLMs can use seamlessly. A complete gateway solution with TLS/HTTPS encryption, rate limiting, embedded analytics, and enterprise security.
Local commands, containerized execution, and HTTP webhooks/APIs
Direct command execution with native performance
Docker-based execution with complete isolation
HTTP client for upstream API integration
Modular design with comprehensive monitoring
Multi-layered security for production deployments
TLS encryption for all client-server traffic
Token bucket algorithm prevents abuse
Docker-based sandboxing for untrusted code
Environment variable expansion for secrets
Comprehensive monitoring with web dashboard
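The token bucket mentioned above caps both sustained request rate and burst size. A minimal sketch of the algorithm (illustrative only; class and method names here are not the gateway's actual implementation):

```python
import time

class TokenBucket:
    """Token bucket: burst_size caps bursts, refill rate caps sustained throughput."""

    def __init__(self, rate_per_minute: float, burst_size: int):
        self.rate = rate_per_minute / 60.0  # refill rate in tokens per second
        self.capacity = burst_size
        self.tokens = float(burst_size)     # start full: a burst is allowed immediately
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0  # spend one token per accepted request
            return True
        return False            # bucket empty: reject until refill

# With requests_per_minute=60 and burst_size=10, a rapid burst of 12
# requests admits the first 10 and rejects the rest.
bucket = TokenBucket(rate_per_minute=60, burst_size=10)
results = [bucket.allow() for _ in range(12)]
```

Each API key gets its own bucket, so one noisy client cannot exhaust another's quota.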
YAML-based setup with auto-discovery
config.yaml

```yaml
commands:
  # Local execution
  - name: list-files
    script: ls
    args: ["-la"]
    description: "List directory contents"

  # Container execution
  - name: python-analysis
    script: python
    args: ["/app/analyze.py"]
    container: "python:3.9-slim"

  # Webhook/API execution
  - name: weather-api
    description: "Get current weather"
    webhook:
      url: "https://api.weather.com/v1/current"
      method: "GET"
      auth:
        type: "bearer"
        token: "${WEATHER_API_KEY}"

server:
  http:
    enabled: true
    host: "localhost"
    port: 8443
    tls:
      enabled: true
      cert_file: "/path/to/cert.pem"
      key_file: "/path/to/key.pem"
    auth:
      enabled: true
      api_keys:
        "mcpfier_key_123":
          permissions: ["*"]
    rate_limit:
      enabled: true
      requests_per_minute: 60
      burst_size: 10

analytics:
  enabled: true
  database_path: "~/.mcpfier/analytics.db"
```
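The `${WEATHER_API_KEY}` placeholder above is resolved from the environment at load time, so secrets never live in the config file. A minimal sketch of such `${VAR}` expansion (the `expand_secrets` helper is illustrative, not the gateway's actual code):

```python
import os
import re

_VAR = re.compile(r"\$\{(\w+)\}")

def expand_secrets(value: str) -> str:
    """Replace ${VAR} placeholders with environment values; leave unset vars intact."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), value)

os.environ["WEATHER_API_KEY"] = "s3cret"
expand_secrets("Bearer ${WEATHER_API_KEY}")  # -> "Bearer s3cret"
```

Leaving unset variables untouched makes a missing secret visible in logs instead of silently expanding to an empty string.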
Transform internal APIs and workflows into MCP tools
Linting, testing, building, deployment automation
Health checks, log analysis, backup operations
Bridge MCP clients to existing REST APIs and webhooks
Combine local scripts, containerized tools, and external APIs
Enable LLMs to access enterprise systems and external services
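With the server configured as above, an authenticated tool call could be shaped as follows. The JSON-RPC `tools/call` payload follows the MCP convention; the `/mcp` path and bearer-token header are assumptions about this gateway's HTTP endpoint (the request is built but not sent):

```python
import json
import urllib.request

# Hypothetical endpoint and API key matching the sample config above.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "list-files", "arguments": {}},
}

req = urllib.request.Request(
    "https://localhost:8443/mcp",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer mcpfier_key_123",  # API key from config.yaml
        "Content-Type": "application/json",
    },
)
```

Because rate limiting is per key, clients should be prepared to back off on HTTP 429 responses.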