
xiyan_mcp_server

An MCP server that supports fetching data from a database using natural language queries, powered by XiyanSQL as the text-to-SQL LLM.

Publisher: xiyan_mcp_server
Submitted date: 4/13/2025
<h1 align="center">XiYan MCP Server: Empowering LLMs with Seamless Database Integration</h1>
<p align="center">
<a href="https://github.com/XGenerationLab/XiYan-SQL"><img alt="MCP Playwright" src="https://raw.githubusercontent.com/XGenerationLab/XiYan-SQL/main/xiyanGBI.png" height="60"/></a>
</p>
<p align="center">
<b>Unlocking the Power of Natural Language Database Queries with a Robust Model Context Protocol (MCP) Server</b><br/>
<sub>Powered by <a href="https://github.com/XGenerationLab/XiYan-SQL">XiYan-SQL</a>, the State-of-the-Art Text-to-SQL Solution on Open Benchmarks</sub>
</p>
<p align="center">
💻 <a href="https://github.com/XGenerationLab/xiyan_mcp_server">XiYan-mcp-server</a> |
🌐 <a href="https://github.com/XGenerationLab/XiYan-SQL">XiYan-SQL</a> |
📖 <a href="https://arxiv.org/abs/2411.08599">Arxiv</a> |
📄 <a href="https://paperswithcode.com/paper/xiyan-sql-a-multi-generator-ensemble">PapersWithCode</a> |
💻 <a href="https://huggingface.co/collections/XGenerationLab/xiyansql-models-67c9844307b49f87436808fc">HuggingFace</a> |
🤖 <a href="https://modelscope.cn/collections/XiYanSQL-Models-4483337b614241">ModelScope</a> |
🌕 <a href="https://bailian.console.aliyun.com/xiyan">XiYan GBI (析言GBI)</a>
<br />
<a href="https://arxiv.org/abs/2411.08599"><img src="imgs/Paper-Arxiv-orange.svg"/></a>
<a href="https://opensource.org/licenses/Apache-2.0"><img src="https://img.shields.io/badge/License-Apache%202.0-blue.svg" alt="License: Apache 2.0"/></a>
<a href="https://pepy.tech/projects/xiyan-mcp-server"><img src="https://static.pepy.tech/badge/xiyan-mcp-server" alt="PyPI Downloads"/></a>
<a href="https://smithery.ai/server/@XGenerationLab/xiyan_mcp_server"><img alt="Smithery Installs" src="https://smithery.ai/badge/@XGenerationLab/xiyan_mcp_server" height="20"/></a>
<a href="https://github.com/XGenerationLab/xiyan_mcp_server" target="_blank"><img src="https://img.shields.io/github/stars/XGenerationLab/xiyan_mcp_server?style=social" alt="GitHub stars"/></a>
<br />
<a href="https://github.com/XGenerationLab/xiyan_mcp_server">English</a> | <a href="https://github.com/XGenerationLab/xiyan_mcp_server/blob/main/README_zh.md">中文</a><br />
<a href="https://github.com/XGenerationLab/xiyan_mcp_server/blob/main/imgs/dinggroup_out.png">Ding Group钉钉群</a> | <a href="https://weibo.com/u/2540915670" target="_blank">Follow me on Weibo</a>
</p>

## Navigating the XiYan MCP Server Ecosystem

- [Key Features](#key-features)
- [Deep Dive: Architecture and Best Practices](#deep-dive-architecture-and-best-practices)
  - [Architectural Overview](#architectural-overview)
  - [Optimizing Performance: Best Practices for Integration](#optimizing-performance-best-practices-for-integration)
  - [Tooling Showcase](#tooling-showcase)
- [Getting Started: Installation Guide](#getting-started-installation-guide)
  - [Installation via pip: The Recommended Approach](#installation-via-pip-the-recommended-approach)
  - [Alternative Installation: Smithery.ai](#alternative-installation-smitheryai)
- [Configuration Mastery: Tailoring the Server to Your Needs](#configuration-mastery-tailoring-the-server-to-your-needs)
  - [LLM Configuration: Choosing the Right Model for the Job](#llm-configuration-choosing-the-right-model-for-the-job)
    - [Leveraging General-Purpose LLMs (GPT, QwenMax)](#leveraging-general-purpose-llms-gpt-qwenmax)
    - [Harnessing the Power of Text-to-SQL SOTA Models](#harnessing-the-power-of-text-to-sql-sota-models)
    - [Deploying Local Models: Enhanced Security and Control](#deploying-local-models-enhanced-security-and-control)
  - [Database Configuration: Connecting to Your Data Source](#database-configuration-connecting-to-your-data-source)
    - [Configuring MySQL Connectivity](#configuring-mysql-connectivity)
    - [Configuring PostgreSQL Connectivity](#configuring-postgresql-connectivity)
- [Deployment Strategies: Launching the Server in Various Environments](#deployment-strategies-launching-the-server-in-various-environments)
  - [Integrating with Claude Desktop](#integrating-with-claude-desktop)
  - [Integrating with Cline](#integrating-with-cline)
  - [Integrating with Goose](#integrating-with-goose)
  - [Integrating with Cursor](#integrating-with-cursor)
  - [Integrating with Witsy](#integrating-with-witsy)
- [Troubleshooting: Addressing Common Issues](#troubleshooting-addressing-common-issues)
- [Acknowledging Our Work: Citation Information](#acknowledging-our-work-citation-information)

## Key Features

- **Natural Language Data Retrieval:** Seamlessly query databases using natural language through the power of [XiYanSQL](https://github.com/XGenerationLab/XiYan-SQL).
- **Versatile LLM Support:** Compatible with general-purpose LLMs (GPT, QwenMax) and specialized Text-to-SQL SOTA models.
- **Enhanced Security with Local Mode:** Operate in a completely local environment for maximum data security.
- **Broad Database Compatibility:** Supports both MySQL and PostgreSQL databases.
- **Dynamic Resource Discovery:** Automatically lists available database tables as accessible resources.
- **Data Preview Functionality:** Provides the ability to preview table contents for enhanced context.

## Deep Dive: Architecture and Best Practices

### Architectural Overview

The XiYan MCP Server offers two primary integration modes: remote and local.

* **Remote Mode (Default):** This mode leverages a cloud-based Text-to-SQL model (e.g., `xiyanSQL-qwencoder-32B`) accessed via an API key. This provides high performance and accuracy.
* **Local Mode:** For environments requiring enhanced security, the local mode allows you to run the Text-to-SQL model directly on your infrastructure, eliminating the need for external API calls.
![architecture.png](imgs/architecture.png)

### Optimizing Performance: Best Practices for Integration

[Build a local data assistant using MCP + Modelscope API-Inference without writing a single line of code](https://mp.weixin.qq.com/s/tzDelu0W4w6t9C0_yYRbHA)

### Tooling Showcase

* **`get_data` Tool:** This tool empowers users to retrieve data from databases using natural language queries. The server intelligently translates these queries into SQL using a built-in model and executes them against the database.
* **`{dialect}://{table_name}` Resource:** This resource allows you to retrieve sample data from a specific table, providing valuable context for the LLM.
* **`{dialect}://` Resource:** This resource lists the names of all available databases, enabling easy discovery and exploration.

## Getting Started: Installation Guide

### Installation via pip: The Recommended Approach

The XiYan MCP Server requires Python 3.11 or higher. The recommended installation method is via `pip`:

```bash
pip install xiyan-mcp-server
```

After installation, you can run the server using:

```bash
python -m xiyan_mcp_server
```

However, the server will not be fully functional until you configure it with a YAML configuration file. A sample configuration file (`config_demo.yml`) will be generated upon initial execution. To specify the configuration file, use the `YML` environment variable:

```bash
env YML=path/to/yml python -m xiyan_mcp_server
```
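For programmatic launches, the same `YML` environment variable can be set from Python. A minimal sketch under stated assumptions (the `server_env`/`launch` helper names are ours, not part of the package):

```python
import os
import subprocess
import sys

def server_env(yml_path: str) -> dict:
    # Merge the current environment with the YML variable the server reads.
    return dict(os.environ, YML=yml_path)

def launch(yml_path: str) -> subprocess.Popen:
    # Equivalent to: env YML=path/to/yml python -m xiyan_mcp_server
    return subprocess.Popen(
        [sys.executable, "-m", "xiyan_mcp_server"],
        env=server_env(yml_path),
    )
```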

### Alternative Installation: Smithery.ai

The XiYan MCP Server is also available on [Smithery.ai](https://smithery.ai/server/@XGenerationLab/xiyan_mcp_server): @XGenerationLab/xiyan_mcp_server. Please note that this installation method is not as thoroughly tested as the pip installation.

## Configuration Mastery: Tailoring the Server to Your Needs

The XiYan MCP Server is configured using a YAML file. A sample configuration file (`config_demo.yml`) is provided as a starting point.

```yaml
model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```
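Before launching, it can help to sanity-check that the YAML contains both required sections. A minimal sketch, assuming the file parses into a plain dict (the shape PyYAML's `yaml.safe_load` would produce); the `validate_config` helper is ours, not part of the server:

```python
def validate_config(cfg: dict) -> list:
    """Return a list of problems found in a parsed config dict (empty = OK)."""
    problems = []
    model = cfg.get("model", {})
    for field in ("name", "key", "url"):
        if field not in model:
            problems.append(f"model.{field} is missing")
    database = cfg.get("database", {})
    for field in ("host", "port", "user", "password", "database"):
        if field not in database:
            problems.append(f"database.{field} is missing")
    return problems
```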

### LLM Configuration: Choosing the Right Model for the Job

The `model` section of the configuration file defines the LLM used for Text-to-SQL conversion.

| Versions | General LLMs (GPT, QwenMax) | SOTA Model by ModelScope | SOTA Model by DashScope | Local LLMs |
|---|---|---|---|---|
| **Description** | Basic, easy to use | Best performance, stable, recommended | Best performance, for trial | Slow, high-security |
| **Name** | The official model name (e.g., `gpt-3.5-turbo`, `qwen-max`) | `XGenerationLab/XiYanSQL-QwenCoder-32B-2412` | `xiyansql-qwencoder-32b` | `xiyansql-qwencoder-3b` |
| **Key** | The API key of the service provider (e.g., OpenAI, Alibaba Cloud) | The API key of ModelScope | The API key via email | `""` |
| **URL** | The endpoint of the service provider (e.g., `https://api.openai.com/v1`) | `https://api-inference.modelscope.cn/v1/` | `https://xiyan-stream.biz.aliyun.com/service/api/xiyan-sql` | `http://localhost:5090` |
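The table can be read as a lookup from backend to default `name` and `url`. A small helper sketch; the `BACKEND_DEFAULTS`/`model_section` names and structure are ours, not part of the server:

```python
# Default model name and endpoint per backend, taken from the table above.
BACKEND_DEFAULTS = {
    "modelscope": ("XGenerationLab/XiYanSQL-QwenCoder-32B-2412",
                   "https://api-inference.modelscope.cn/v1/"),
    "dashscope": ("xiyansql-qwencoder-32b",
                  "https://xiyan-stream.biz.aliyun.com/service/api/xiyan-sql"),
    "local": ("xiyansql-qwencoder-3b", "http://localhost:5090"),
}

def model_section(backend: str, key: str = "") -> dict:
    """Build the model section of the config for a given backend."""
    name, url = BACKEND_DEFAULTS[backend]
    return {"name": name, "key": key, "url": url}
```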

#### Leveraging General-Purpose LLMs (GPT, QwenMax)

To use a general-purpose LLM like GPT-3.5, configure the `model` section as follows:

```yaml
model:
  name: "gpt-3.5-turbo"
  key: "YOUR KEY"
  url: "https://api.openai.com/v1"
database:
```

For Qwen from Alibaba Cloud:

```yaml
model:
  name: "qwen-max"
  key: "YOUR KEY"
  url: "https://dashscope.aliyuncs.com/compatible-mode/v1"
database:
```
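Both of the endpoints above speak the OpenAI-compatible chat-completions protocol, so the request the server ultimately issues has the same shape in either case. A standard-library sketch of that request, as an illustration of the protocol rather than the server's actual code (the prompt text and `build_chat_request` helper are our assumptions):

```python
import json
from urllib.request import Request

def build_chat_request(model: dict, question: str, schema: str) -> Request:
    """Build an OpenAI-style chat-completions request for text-to-SQL."""
    body = {
        "model": model["name"],
        "messages": [
            {"role": "system",
             "content": f"Translate questions into SQL for this schema:\n{schema}"},
            {"role": "user", "content": question},
        ],
    }
    return Request(
        model["url"].rstrip("/") + "/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {model['key']}",
        },
        method="POST",
    )
```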

#### Harnessing the Power of Text-to-SQL SOTA Models

The [XiYanSQL-qwencoder-32B](https://github.com/XGenerationLab/XiYanSQL-QwenCoder) model is the state-of-the-art Text-to-SQL model, achieving top performance on the BIRD benchmark. You can access it through either ModelScope or Alibaba Cloud DashScope.

**(1) ModelScope Version**

Obtain an API-Inference key from [ModelScope](https://www.modelscope.cn/docs/model-service/API-Inference/intro) and configure the `model` section:

```yaml
model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"
```

Refer to the model description for detailed information.

**(2) DashScope Version**

To use the DashScope version, you'll need a key. Contact [email protected] with the following information:

```
name: "YOUR NAME",
email: "YOUR EMAIL",
organization: "Your College or Company or Organization"
```

A key will be provided, subject to usage limits and expiration.

```yaml
model:
  name: "xiyansql-qwencoder-32b"
  key: "KEY"
  url: "https://xiyan-stream.biz.aliyun.com/service/api/xiyan-sql"
database:
```

This service is intended for trial purposes. Contact us for production deployments. Alternatively, you can deploy the XiYanSQL-qwencoder-32B model on your own server.

#### Deploying Local Models: Enhanced Security and Control

Running the `xiyan_mcp_server` in local mode provides the highest level of security.

**Prerequisites:**

* PC/Mac with at least 16GB RAM
* 6GB disk space

**Steps:**

1. Install additional Python packages:

   ```bash
   pip install flask modelscope torch==2.2.2 "accelerate>=0.26.0" numpy==2.2.3
   ```

2. (Optional) Manually download the model:

   We recommend the [xiyansql-qwencoder-3b model](https://www.modelscope.cn/models/XGenerationLab/XiYanSQL-QwenCoder-3B-2502/).

   ```bash
   modelscope download --model XGenerationLab/XiYanSQL-QwenCoder-3B-2502
   ```

3. Run the local server script:

   Execute `src/xiyan_mcp_server/local_xiyan_server.py`:

   ```bash
   python local_xiyan_server.py
   ```

   The server will run on http://localhost:5090/.

4. Configure `config.yml`:

   ```yaml
   model:
     name: "xiyansql-qwencoder-3b"
     key: "KEY"
     url: "http://127.0.0.1:5090"
   ```

Local mode is now ready. Note that local model inference is significantly slower than using a cloud-based service.
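The `url` in step 4 is consumed like any other model endpoint, so you can verify the wiring with a stand-in server before loading the real model. A standard-library sketch; the real `local_xiyan_server.py` uses Flask and runs the model, and the OpenAI-style response shape here is our assumption, not its documented contract:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubModelHandler(BaseHTTPRequestHandler):
    """Answers every POST with a canned OpenAI-style chat completion."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        _request = json.loads(self.rfile.read(length) or b"{}")
        reply = {"choices": [{"message": {"content": "SELECT 1"}}]}
        payload = json.dumps(reply).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep request logging quiet
        pass

def start_stub(port: int = 0) -> HTTPServer:
    """Start the stub on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), StubModelHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Point a test config's `url` at the stub's address to exercise the server end to end without GPU inference.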

### Database Configuration: Connecting to Your Data Source

The `database` section of the configuration file specifies the connection details for your database.
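Those fields map directly onto a standard database URL, which is a convenient way to double-check a `database` section. A small sketch; the `database_url` helper and the default `mysql` dialect are our assumptions, not the server's internals:

```python
def database_url(db: dict) -> str:
    """Format a database config section as a SQLAlchemy-style connection URL."""
    dialect = db.get("dialect", "mysql")  # the MySQL examples omit `dialect`
    return (f"{dialect}://{db['user']}:{db['password']}"
            f"@{db['host']}:{db['port']}/{db['database']}")
```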

#### Configuring MySQL Connectivity

```yaml
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```

#### Configuring PostgreSQL Connectivity

1. Install the required Python package:

   ```bash
   pip install psycopg2
   ```

2. Configure `config.yml`:

   ```yaml
   database:
     dialect: "postgresql"
     host: "localhost"
     port: 5432
     user: ""
     password: ""
     database: ""
   ```

   Ensure that `dialect` is set to `"postgresql"`.

## Deployment Strategies: Launching the Server in Various Environments

### Integrating with Claude Desktop

Add the following to your Claude Desktop configuration file:

```json
{
  "mcpServers": {
    "xiyan-mcp-server": {
      "command": "python",
      "args": ["-m", "xiyan_mcp_server"],
      "env": {"YML": "PATH/TO/YML"}
    }
  }
}
```
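If you manage several MCP servers, this JSON entry can be generated rather than hand-edited. A sketch; the `claude_entry` helper name is ours:

```python
import json

def claude_entry(yml_path: str, python: str = "python") -> dict:
    """Build the mcpServers entry for xiyan-mcp-server."""
    return {
        "mcpServers": {
            "xiyan-mcp-server": {
                "command": python,
                "args": ["-m", "xiyan_mcp_server"],
                "env": {"YML": yml_path},
            }
        }
    }

# json.dumps(claude_entry("PATH/TO/YML"), indent=2) yields the block above
```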

### Integrating with Cline

Use the same configuration as Claude Desktop.

### Integrating with Goose

Add the following command to the Goose configuration:

```bash
env YML=path/to/yml python -m xiyan_mcp_server
```

### Integrating with Cursor

Use the same command as Goose.

### Integrating with Witsy

Add the following command:

```bash
python -m xiyan_mcp_server
```

Add an environment variable `YML` whose value is the path to your YAML configuration file.

## Troubleshooting: Addressing Common Issues

If you encounter any issues, please contact us:

[Ding Group钉钉群](https://github.com/XGenerationLab/xiyan_mcp_server/blob/main/imgs/dinggroup_out.png) | [Follow me on Weibo](https://weibo.com/u/2540915670)

## Acknowledging Our Work: Citation Information

If you find our work helpful, please cite it:

```bibtex
@article{xiyansql,
  title={A Preview of XiYan-SQL: A Multi-Generator Ensemble Framework for Text-to-SQL},
  author={Yingqi Gao and Yifu Liu and Xiaoxia Li and Xiaorong Shi and Yin Zhu and Yiming Wang and Shiqi Li and Wei Li and Yuntao Hong and Zhiling Luo and Jinyang Gao and Liyu Mou and Yu Li},
  year={2024},
  journal={arXiv preprint arXiv:2411.08599},
  url={https://arxiv.org/abs/2411.08599},
  primaryClass={cs.AI}
}
```
