A Model Context Protocol (MCP) server for Dify AI. This server enables LLMs to interact with Dify AI's chat completion capabilities through a standardized protocol.
- Integration with Dify AI chat completion API
- Restaurant recommendation tool (meshi-doko)
- Support for conversation context
- Streaming response support
- TypeScript implementation
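For orientation, the sketch below shows roughly what a chat completion call to Dify looks like from TypeScript. This is illustrative only: `askDify` is a hypothetical helper, not part of this repository, and the request shape assumes Dify's public `chat-messages` endpoint.

```typescript
// Minimal sketch of a Dify chat-completion request (hypothetical helper).
// `endpoint` and `apiKey` stand in for the values passed to the server.
async function askDify(
  endpoint: string,
  apiKey: string,
  query: string,
  conversationId?: string
): Promise<Response> {
  return fetch(`${endpoint}/chat-messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: {},                            // app-level input variables, if any
      query,                                 // the question to send to Dify
      response_mode: "streaming",            // Dify streams server-sent events
      conversation_id: conversationId ?? "", // empty string starts a new chat
      user: "mcp-server",                    // arbitrary caller identifier
    }),
  });
}
```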
```sh
# Build the Docker image
make docker

# Run with Docker
docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key
```
Add the following configuration to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```
Replace `your-dify-api-endpoint` and `your-dify-api-key` with your actual Dify API credentials.
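If you prefer the Docker image built above, the same configuration can invoke `docker` instead of `npx`. This is a sketch: adjust the image tag if yours differs from the `mcp/dify` name used earlier.

```json
{
  "mcpServers": {
    "dify": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "mcp/dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
```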
Restaurant recommendation tool that interfaces with Dify AI.

Parameters:

- `LOCATION` (string): Location of the restaurant
- `BUDGET` (string): Budget constraints
- `query` (string): Query to send to Dify AI
- `conversation_id` (string, optional): For maintaining chat context
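Assuming the tool is registered under the name `meshi-doko` (per the tool description above), a minimal client call using the official MCP TypeScript SDK might look like this. The argument values are placeholders.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way Claude Desktop does.
const transport = new StdioClientTransport({
  command: "npx",
  args: [
    "-y",
    "@modelcontextprotocol/server-dify",
    "https://your-dify-api-endpoint",
    "your-dify-api-key",
  ],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Call the restaurant recommendation tool with the parameters listed above.
const result = await client.callTool({
  name: "meshi-doko",
  arguments: {
    LOCATION: "Shibuya, Tokyo",
    BUDGET: "3000 JPY",
    query: "Recommend a good ramen shop",
    // conversation_id is optional on the first call; pass the id Dify
    // returns to keep the chat context on follow-up calls.
  },
});

console.log(result.content);
```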
```sh
# Initial setup
make setup

# Build the project
make build

# Format code
make format

# Run linter
make lint
```
This project is released under the MIT License.
This server interacts with Dify AI using your provided API key. Be sure to:
- Keep your API credentials secure
- Use HTTPS for the API endpoint
- Never commit API keys to version control
Contributions are welcome! Please feel free to submit a Pull Request.