# MCP Integration
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. Nilux AI supports MCP servers, allowing you to extend its capabilities with custom tools — database queries, API integrations, internal service access, and anything else your workflow needs.
## What is MCP?
MCP (Model Context Protocol) defines a standardized way for AI models to interact with external tools. An MCP server exposes a set of tools that the AI can discover and call. The protocol handles:
- Tool discovery — The AI asks "what tools are available?" and receives a list with descriptions and parameter schemas
- Tool invocation — The AI calls a tool with parameters and receives results
- Transport — Communication between the AI client and the MCP server
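On the wire, discovery and invocation are both JSON-RPC messages. As a rough illustration of the shape (trimmed — the full field set is defined by the MCP specification):

```json
// Request
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

// Response
{"jsonrpc": "2.0", "id": 1,
 "result": {"tools": [{"name": "get_weather",
                       "description": "Get current weather for a city"}]}}
```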
## Supported Transports
Nilux AI supports three MCP transport types:
### stdio Transport
The MCP server runs as a child process, communicating over standard input/output.
```json
{
  "mcpServers": {
    "my-tools": {
      "command": "node",
      "args": ["./mcp-server.js"],
      "env": {
        "API_KEY": "sk_..."
      }
    }
  }
}
```
Best for: Local tools, scripts, and lightweight servers that run on the same machine.
### SSE Transport
The MCP server communicates over Server-Sent Events (SSE), typically via HTTP.
```json
{
  "mcpServers": {
    "remote-api": {
      "url": "https://my-mcp-server.example.com/sse",
      "headers": {
        "Authorization": "Bearer sk_live_..."
      }
    }
  }
}
```
Best for: Remote tools, shared services, and tools that need persistent connections.
### HTTP Transport
The MCP server exposes tools via standard HTTP endpoints (JSON-RPC).
```json
{
  "mcpServers": {
    "internal-db": {
      "url": "http://localhost:8080/mcp",
      "transport": "http"
    }
  }
}
```
Best for: Existing services with REST APIs that implement the MCP interface.
## Configuration
MCP servers are configured in `~/.nilux/config.json` under the `mcpServers` key:
```json
{
  "api_key": "sk_live_xxxxxxxxxxxxxxxxxxxxxxxx",
  "model": "standard",
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-filesystem", "/path/to/allowed/dir"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-github"],
      "env": {
        "GITHUB_TOKEN": "ghp_..."
      }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-postgres", "postgresql://localhost/mydb"]
    },
    "custom-api": {
      "url": "https://api.mycompany.com/mcp",
      "headers": {
        "Authorization": "Bearer internal_token_xyz"
      }
    }
  }
}
```
Each server entry uses a subset of these fields, depending on its transport:

| Field | Description |
|---|---|
| `command` | (stdio) The executable to run |
| `args` | (stdio) Command-line arguments |
| `env` | (stdio) Environment variables for the process |
| `url` | (SSE/HTTP) The server URL |
| `headers` | (SSE/HTTP) HTTP headers for requests |
| `transport` | (HTTP) Explicit transport type |
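The split between stdio fields (`command`/`args`/`env`) and remote fields (`url`/`headers`) can be sanity-checked mechanically. A minimal sketch — the `validate_mcp_servers` helper and its rules are illustrative, not part of Nilux AI:

```python
import json

# Illustrative check: each mcpServers entry should be either a stdio
# server (has "command") or a remote server (has "url"), per the field
# table above, and never both or neither.
def validate_mcp_servers(config: dict) -> list[str]:
    problems = []
    for name, entry in config.get("mcpServers", {}).items():
        has_command = "command" in entry
        has_url = "url" in entry
        if has_command == has_url:  # neither, or confusingly both
            problems.append(f"{name}: need exactly one of 'command' or 'url'")
        if has_command and not isinstance(entry.get("args", []), list):
            problems.append(f"{name}: 'args' must be a list")
    return problems

config = json.loads("""
{
  "mcpServers": {
    "local": {"command": "node", "args": ["./mcp-server.js"]},
    "broken": {"args": ["oops"]}
  }
}
""")
print(validate_mcp_servers(config))
# → ["broken: need exactly one of 'command' or 'url'"]
```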
## Example: Filesystem MCP Server
The filesystem MCP server gives Nilux AI access to a specific directory outside your project:
```bash
# Install
npm install -g @anthropic/mcp-filesystem
```

```json
// ~/.nilux/config.json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-filesystem", "/home/user/documents"]
    }
  }
}
```
Now Nilux AI can read, write, and list files in `/home/user/documents` through MCP tools.
## Example: GitHub MCP Server
```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-github"],
      "env": {
        "GITHUB_TOKEN": "ghp_your_token_here"
      }
    }
  }
}
```
Available tools include creating issues, searching repositories, reading pull requests, and more.
## Example: Database MCP Server
```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-postgres", "postgresql://user:pass@localhost:5432/mydb"]
    }
  }
}
```
The agent can now query your database, explore table schemas, and analyze data — all through a controlled MCP interface.
**Warning:** MCP servers have access to the resources you configure. Only grant access to directories, databases, and services that you trust the AI to interact with. Review tool calls before approving them.
## How Nilux Uses MCP Tools
When Nilux AI starts, it connects to all configured MCP servers and discovers their tools. These tools appear alongside built-in tools in the agent's tool palette:
```
Available tools:
• Read, Write, Edit, Glob, Grep, Bash (built-in)
• mcp__filesystem__read_file (MCP: filesystem)
• mcp__filesystem__write_file (MCP: filesystem)
• mcp__github__create_issue (MCP: github)
• mcp__github__search_repositories (MCP: github)
• mcp__postgres__query (MCP: postgres)
```
The agent can then invoke any MCP tool just like a built-in tool:
```
> Create a GitHub issue for the login bug

Agent invokes mcp__github__create_issue:
  repo:  myorg/myrepo
  title: "Login fails with 500 when email contains + symbol"
  body:  "Steps to reproduce:\n1. Register with email user+test@example.com\n..."
```
## Building Your Own MCP Server
MCP servers can be written in any language. The protocol uses JSON-RPC over the chosen transport. A minimal MCP server:
```python
# minimal_mcp_server.py
import sys
import json

def handle_request(request):
    method = request.get("method")
    if method == "tools/list":
        return {
            "tools": [
                {
                    "name": "get_weather",
                    "description": "Get current weather for a city",
                    "inputSchema": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"}
                        },
                        "required": ["city"]
                    }
                }
            ]
        }
    if method == "tools/call":
        params = request.get("params", {})
        tool_name = params.get("name")
        args = params.get("arguments", {})
        if tool_name == "get_weather":
            return {
                "content": [
                    {"type": "text", "text": f"Weather in {args['city']}: 22C, sunny"}
                ]
            }
        return {"error": f"Unknown tool: {tool_name}"}
    return {"error": f"Unknown method: {method}"}

# Simple stdio transport loop: one JSON-RPC message per line.
for line in sys.stdin:
    request = json.loads(line)
    result = handle_request(request)
    # Echo the request id back inside a JSON-RPC response envelope
    response = {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
    sys.stdout.write(json.dumps(response) + "\n")
    sys.stdout.flush()
```
Then configure it:
```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["./minimal_mcp_server.py"]
    }
  }
}
```
**Tip:** Use the official MCP SDK for your language for a more robust implementation. SDKs handle transport, error handling, and protocol compliance.
## Security Considerations
- Scope access — Only expose the directories, databases, and APIs the agent actually needs
- Use environment variables — Never hardcode credentials in config; use `env` for API keys and tokens
- Review tool calls — MCP tool calls appear in the approval flow alongside built-in tools
- Validate inputs — Your MCP server should validate all parameters from the AI before executing operations
- Rate limit — Consider adding rate limits to MCP servers that call external APIs
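Input validation can be as simple as checking the incoming arguments before doing any work. A sketch, using the `get_weather` tool from the minimal server above — the `validate_weather_args` helper and its rules are made up for illustration:

```python
# Illustrative guard for a tools/call handler: reject bad arguments
# before touching any real resource.
def validate_weather_args(args: dict):
    """Return an error message, or None if the arguments are usable."""
    city = args.get("city")
    if not isinstance(city, str) or not city.strip():
        return "'city' must be a non-empty string"
    if len(city) > 200:
        return "'city' is unreasonably long"
    return None

print(validate_weather_args({"city": "Oslo"}))  # → None
print(validate_weather_args({"city": ""}))      # → 'city' must be a non-empty string
```

A handler that sees a non-`None` result would return an error payload instead of executing the tool.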
## Troubleshooting
**Server fails to start**

Check that the command is in your `PATH`, all arguments are correct, and required environment variables are set.

**Tools not appearing**

Verify the MCP server responds to `tools/list` correctly. Check Nilux AI startup logs for connection errors.

**Connection timeouts**

For SSE/HTTP transports, ensure the server is reachable and not blocked by a firewall. Check the URL and any required authentication headers.

**Server crashes during tool call**

Check the server logs for exceptions. Ensure your server handles malformed or unexpected tool parameters gracefully.
## Next Steps
- Tools Overview — Built-in tools
- Agents Overview — Agent system
- Configuration — Full configuration reference
- Skills — Reusable prompt templates