A comprehensive Model Context Protocol (MCP) server for CData Sync with dual transport support. This server exposes CData Sync's REST API as MCP tools, enabling AI assistants like Claude to manage data synchronization jobs, connections, and ETL operations.
Transport Options:
- stdio - For local desktop usage with the Claude Desktop app
- HTTP - For remote server deployments and API access
- 20 Consolidated MCP Tools - Streamlined read/write operations for all entity types
- Dual Transport Support - Both stdio (Claude Desktop) and Streamable HTTP (web clients)
- Real-time Notifications - Live monitoring of job executions and API calls via Server-Sent Events
- Production-Ready Architecture - TypeScript, error handling, logging, and comprehensive type safety
- Multiple Auth Methods - Support for API tokens and basic authentication
- Web Client Support - RESTful HTTP API with streaming capabilities
- Job Management - Execute, monitor, and control data sync jobs
- Connection Management - Test, create, and manage data connections
- User Management - Handle user accounts and permissions
- History & Logging - Access execution history and detailed logs
- Node.js 18+
- A running CData Sync instance
- Claude Desktop (for stdio transport) or web browser (for HTTP transport)
- Clone the repository
  git clone https://github.com/yourusername/cdata-sync-mcp-server.git
  cd cdata-sync-mcp-server
- Install dependencies
  npm install
- Build the project
  npm run build
- Configure environment variables
  # Copy the example environment file
  cp .env.example .env
  # Edit with your CData Sync details
  CDATA_BASE_URL="http://localhost:8181/api.rsc"
  CDATA_AUTH_TOKEN="your-auth-token"
  MCP_TRANSPORT_MODE="both"  # stdio, http, or both
The stdio transport is designed for local desktop usage with the Claude Desktop app. This is the recommended approach for individual developers.
Configuration for Claude Desktop:
{
"mcpServers": {
"cdata-sync-server": {
"command": "node",
"args": ["/absolute/path/to/cdata-sync-mcp-server/dist/index.js"],
"env": {
"MCP_TRANSPORT_MODE": "stdio",
"CDATA_AUTH_TOKEN": "your-token-here",
"CDATA_BASE_URL": "http://localhost:8181/api.rsc",
"DISABLE_SSE": "true"
}
}
}
}
Start stdio-only server:
npm run start:stdio
The HTTP transport is designed for server deployments where the MCP server runs on a remote machine and accepts API requests. This is ideal for:
- Team deployments
- Docker/Kubernetes environments
- Integration with web applications
- Remote access scenarios
Start HTTP-only server:
npm run start:http
Available endpoints:
- GET /mcp/v1/info - Server and protocol information
- GET /mcp/v1/health - Health check
- POST /mcp/v1/message - Send MCP requests
- GET /mcp/v1/stream - Server-Sent Events for real-time updates
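For a quick smoke test of these endpoints, the info and health routes can be queried directly with fetch. A minimal sketch, assuming the default port (3000) and base path (/mcp/v1):
// Quick smoke test of the HTTP transport using Node 18's built-in fetch.
// Assumes the default MCP_HTTP_PORT (3000) and MCP_HTTP_PATH (/mcp/v1).
const base = 'http://localhost:3000/mcp/v1';

const info = await fetch(`${base}/info`).then((res) => res.json());
console.log('Protocol:', info.protocol, info.version);

const health = await fetch(`${base}/health`).then((res) => res.json());
console.log('Health:', health.status, 'pending requests:', health.pendingRequests);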
Example HTTP client usage:
// Connect to the server
const client = new MCPStreamableHttpClient('http://your-server:3000/mcp/v1');
await client.connect();
// List available tools
const tools = await client.listTools();
// Call a tool
const connections = await client.callTool('read_connections', {
action: 'list',
top: 5
});
// Set up real-time monitoring
client.onNotification = (method, params) => {
console.log('Notification:', method, params);
};
For development and testing, you can run both transports simultaneously:
npm run start:both
This is useful for testing both desktop and server scenarios during development.
- read_connections - List, count, get details, or test connections
- write_connections - Create, update, or delete connections
- get_connection_tables - List tables in a connection
- get_table_columns - Get table schema information
- read_jobs - List, count, get details, status, history, or logs
- write_jobs - Create, update, or delete jobs
- execute_job - Run a sync job immediately
- cancel_job - Stop a running job
- execute_query - Run custom SQL queries
- read_tasks - List, count, or get task details
- write_tasks - Create, update, or delete tasks
- read_transformations - List, count, or get transformation details
- write_transformations - Create, update, or delete transformations
- read_users - List, count, or get user details
- write_users - Create or update users
- read_requests - List, count, or get request log details
- write_requests - Delete request logs
- read_history - List or count execution history records
- read_certificates - List certificates
- write_certificates - Create certificates
- configure_sync_server - Get or update server configuration
All read/write tools use an action parameter to specify the operation:
Example: Reading connections
{
"tool": "read_connections",
"arguments": {
"action": "list",
"filter": "contains(Name,'prod')",
"top": 10
}
}
Example: Creating a connection
{
"tool": "write_connections",
"arguments": {
"action": "create",
"name": "MyDatabase",
"providerName": "System.Data.SqlClient",
"connectionString": "Server=localhost;Database=test;"
}
}
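The remaining tools follow the same action-based pattern. As one more illustration, here is a sketch that tests the connection created above through the HTTP client shown earlier; the name argument is an assumption used for illustration, so check the read_connections tool schema for the exact parameter name:
// Verify the connection created above via the consolidated read tool.
// NOTE: the "name" argument is an assumption; consult the read_connections
// tool schema for the exact parameter name.
const client = new MCPStreamableHttpClient('http://your-server:3000/mcp/v1');
await client.connect();

const testResult = await client.callTool('read_connections', {
  action: 'test',
  name: 'MyDatabase'
});
console.log('Connection test result:', testResult);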
The HTTP transport provides real-time notifications for:
- Tool execution start/completion
- Job execution progress
- Configuration changes
- Error notifications
// Monitor all server events
const eventSource = new EventSource('http://localhost:3000/mcp/v1/stream');
eventSource.onmessage = (event) => {
const message = JSON.parse(event.data);
if (message.method === 'notifications/job_executed') {
console.log('Job completed:', message.params);
}
};
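A client can combine the two mechanisms by triggering a job with execute_job and watching the stream for the matching job_executed notification. A rough sketch building on the examples above; the job name is hypothetical and the execute_job argument names are assumptions:
// Kick off a job and watch the SSE stream for its completion notification.
// "NightlyLoad" is a hypothetical job name used only for illustration.
const stream = new EventSource('http://localhost:3000/mcp/v1/stream');
stream.onmessage = (event) => {
  const message = JSON.parse(event.data);
  if (message.method === 'notifications/job_executed' &&
      message.params?.jobName === 'NightlyLoad') {
    console.log('NightlyLoad finished:', message.params.result);
    stream.close();
  }
};

// client is the MCPStreamableHttpClient instance from the earlier example
await client.callTool('execute_job', { jobName: 'NightlyLoad' });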
# Start in development mode with both transports
npm run dev:both
# Start with stdio only
npm run dev:stdio
# Start with HTTP only
npm run dev:http
# Type checking
npm run typecheck
# Linting
npm run lint
npm run lint:fix
# Testing
npm test
npm run test:watch
npm run test:coverage
| Variable | Description | Default |
|----------|-------------|---------|
| CDATA_BASE_URL | CData Sync API base URL | http://localhost:8181/api.rsc |
| CDATA_AUTH_TOKEN | API authentication token | - |
| CDATA_USERNAME | Basic auth username (alternative to token) | - |
| CDATA_PASSWORD | Basic auth password (alternative to token) | - |
| MCP_TRANSPORT_MODE | Transport mode: stdio, http, or both | stdio |
| MCP_HTTP_PORT | HTTP transport port | 3000 |
| MCP_HTTP_PATH | HTTP transport base path | /mcp/v1 |
| NODE_ENV | Node environment | production |
| LOG_LEVEL | Logging level | info |
# Build image
docker build -t cdata-sync-mcp-server .
# Run with stdio transport
docker run -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
# Run with HTTP transport
docker run -p 3000:3000 -e MCP_TRANSPORT_MODE=http -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
# Start with Docker Compose
docker-compose up -d cdata-sync-mcp-both
# Deploy to Kubernetes
kubectl apply -f k8s/
# Install as systemd service
sudo cp cdata-sync-mcp.service /etc/systemd/system/
sudo systemctl enable cdata-sync-mcp
sudo systemctl start cdata-sync-mcp
GET /mcp/v1/info
{
"protocol": "Model Context Protocol",
"version": "2025-03-26",
"transport": "streamable-http",
"endpoints": {
"message": "http://localhost:3000/mcp/v1/message",
"stream": "http://localhost:3000/mcp/v1/stream"
}
}
GET /mcp/v1/health
{
"status": "healthy",
"transport": "streamable-http",
"timestamp": "2024-01-15T10:30:00Z",
"pendingRequests": 0,
"bufferedMessages": 0
}
POST /mcp/v1/message
{
"jsonrpc": "2.0",
"id": "1",
"method": "tools/call",
"params": {
"name": "read_connections",
"arguments": {
"action": "list",
"top": 5
}
}
}
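Sent with plain fetch, the same request looks like the sketch below, assuming the default host, port, and path:
// POST the JSON-RPC request above to the message endpoint (Node 18+ or a browser).
// Adjust the URL for your deployment.
const response = await fetch('http://localhost:3000/mcp/v1/message', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    jsonrpc: '2.0',
    id: '1',
    method: 'tools/call',
    params: {
      name: 'read_connections',
      arguments: { action: 'list', top: 5 }
    }
  })
});
console.log(await response.json());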
GET /mcp/v1/stream
Server-Sent Events stream providing real-time notifications:
data: {"jsonrpc":"2.0","method":"notifications/tool_execution","params":{"tool":"read_connections","timestamp":"2024-01-15T10:30:00Z"}}
data: {"jsonrpc":"2.0","method":"notifications/job_executed","params":{"jobName":"TestJob","result":"success","timestamp":"2024-01-15T10:31:00Z"}}
# Run all tests
npm test
# Run with coverage
npm run test:coverage
# Watch mode for development
npm run test:watch
src/
├── __tests__/
│   ├── services/     # Service unit tests
│   ├── transport/    # Transport tests
│   ├── integration/  # Integration tests
│   └── utils/        # Utility tests
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Documentation: Full API documentation available in the docs directory
- Issues: Report bugs and request features via GitHub Issues
- Discussions: Community support via GitHub Discussions
Built with ❤️ for the MCP ecosystem