Trino MCP Server in Go
A high-performance Model Context Protocol (MCP) server for Trino implemented in Go. This project enables AI assistants to seamlessly interact with Trino's distributed SQL query engine through standardized MCP tools.
Trino (formerly PrestoSQL) is a powerful distributed SQL query engine designed for fast analytics on large datasets.
✅ MCP server implementation in Go
✅ Trino SQL query execution through MCP tools
✅ Catalog, schema, and table discovery
✅ Docker container support
✅ Supports both STDIO and HTTP transports
✅ Server-Sent Events (SSE) support for Cursor and other MCP clients
✅ Compatible with Cursor, Claude Desktop, Windsurf, ChatWise, and any other MCP-compatible client
The easiest way to install mcp-trino is using Homebrew:
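A minimal sketch of the install commands, with the tap name left as a placeholder (use whichever tap the project publishes its formula under):

```bash
# Replace <project-tap> with the project's published Homebrew tap
brew tap <project-tap>
brew install mcp-trino
```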
To update to the latest version:
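Assuming the formula was installed via Homebrew as above:

```bash
brew update
brew upgrade mcp-trino
```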
Place the binary in a directory included in your PATH (e.g., /usr/local/bin on Linux/macOS)
Make it executable (chmod +x mcp-trino on Linux/macOS)
You can download pre-built binaries for your platform:
| Platform | Architecture |
| --- | --- |
| macOS | x86_64 (Intel) |
| macOS | ARM64 (Apple Silicon) |
| Linux | x86_64 |
| Linux | ARM64 |
| Windows | x86_64 |
This MCP server can be integrated with several AI applications:
To use the Docker image instead of a local binary:
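A sketch of an MCP client entry that runs the server in a container, using the mcpServers layout that Cursor and Claude Desktop expect. The image path is a placeholder; the project publishes images to GitHub Container Registry, so check the repository's packages page for the exact name:

```json
{
  "mcpServers": {
    "mcp-trino": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "TRINO_HOST=host.docker.internal",
        "-e", "TRINO_PORT=8080",
        "-e", "TRINO_USER=trino",
        "ghcr.io/<owner>/mcp-trino:latest"
      ]
    }
  }
}
```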
Note: The host.docker.internal special DNS name allows the container to connect to services running on the host machine. If your Trino server is running elsewhere, replace it with the appropriate host.
This Docker configuration can be used in any of the applications below.
Replace the environment variables with your specific Trino configuration.
For HTTP+SSE transport mode (supported for Cursor integration):
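For example, a Cursor entry in ~/.cursor/mcp.json could point at the SSE endpoint directly (9097 is the documented default MCP_PORT; adjust it if you changed the port):

```json
{
  "mcpServers": {
    "trino": {
      "url": "http://localhost:9097/sse"
    }
  }
}
```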
Then start the server in a separate terminal with:
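Assuming mcp-trino is on your PATH, something like this (adjust the TRINO_* values to match your cluster):

```bash
MCP_TRANSPORT=http MCP_PORT=9097 \
TRINO_HOST=localhost TRINO_PORT=8080 TRINO_USER=trino \
mcp-trino
```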
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
After updating the configuration, restart Claude Desktop. You should see the MCP tools available in the tools menu.
Restart Windsurf to apply the changes. The Trino MCP tools will be available to the Cascade AI.
Open ChatWise and go to Settings
Navigate to the Tools section
Click the "+" icon to add a new tool
Select "Command Line MCP"
Configure with the following details:
ID: mcp-trino (or any name you prefer)
Command: mcp-trino
Args: (leave empty)
Env: Add the following environment variables:
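The variables to set here are the usual connection settings from the configuration table below; for example (placeholder values):

```
TRINO_HOST=<your-trino-host>
TRINO_PORT=8080
TRINO_USER=trino
TRINO_PASSWORD=<your-password>
```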
Alternatively, you can import the configuration from JSON:
Copy the JSON shown after these steps to your clipboard:
In ChatWise Settings > Tools, click the "+" icon
Select "Import JSON from Clipboard"
Toggle the switch next to the tool to enable it
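The JSON referenced in step 1 might look something like this, assuming the standard mcpServers layout (replace the connection values with your own):

```json
{
  "mcpServers": {
    "mcp-trino": {
      "command": "mcp-trino",
      "args": [],
      "env": {
        "TRINO_HOST": "<your-trino-host>",
        "TRINO_PORT": "8080",
        "TRINO_USER": "trino",
        "TRINO_PASSWORD": "<your-password>"
      }
    }
  }
}
```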
Once enabled, click the hammer icon below the input box in ChatWise to access Trino MCP tools.
The server provides the following MCP tools:
Execute a SQL query against Trino with full SQL support for complex analytical queries.
Sample Prompt:
"How many customers do we have per region? Can you show them in descending order?"
Example:
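The tool and argument names below are assumptions for illustration (the exact names may differ); a call answering the prompt above might look like:

```json
{
  "tool": "execute_query",
  "arguments": {
    "query": "SELECT r.name AS region, COUNT(*) AS customer_count FROM tpch.tiny.customer c JOIN tpch.tiny.nation n ON c.nationkey = n.nationkey JOIN tpch.tiny.region r ON n.regionkey = r.regionkey GROUP BY r.name ORDER BY customer_count DESC"
  }
}
```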
Response:
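The exact response format depends on the server version; conceptually you get rows back, for example (the counts here are invented for illustration):

```json
[
  {"region": "ASIA", "customer_count": 308},
  {"region": "AMERICA", "customer_count": 301},
  {"region": "MIDDLE EAST", "customer_count": 298},
  {"region": "EUROPE", "customer_count": 297},
  {"region": "AFRICA", "customer_count": 296}
]
```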
List all catalogs available in the Trino server, providing a comprehensive view of your data ecosystem.
Sample Prompt:
"What databases do we have access to in our Trino environment?"
Example:
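With assumed tool naming, the call takes no arguments:

```json
{
  "tool": "list_catalogs",
  "arguments": {}
}
```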
Response:
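The catalogs returned depend on your deployment; for example:

```json
["memory", "system", "tpch"]
```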
List all schemas in a catalog, helping you navigate through the data hierarchy efficiently.
Sample Prompt:
"What schemas or datasets are available in the tpch catalog?"
Example:
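Again with assumed names, passing the catalog to inspect:

```json
{
  "tool": "list_schemas",
  "arguments": {"catalog": "tpch"}
}
```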
Response:
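An abridged illustration of what the tpch catalog typically exposes:

```json
["information_schema", "tiny", "sf1", "sf100"]
```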
List all tables in a schema, giving you visibility into available datasets.
Sample Prompt:
"What tables are available in the tpch tiny schema? I need to know what data we can query."
Example:
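A call with assumed argument names, scoped to the tpch.tiny schema:

```json
{
  "tool": "list_tables",
  "arguments": {"catalog": "tpch", "schema": "tiny"}
}
```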
Response:
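The tiny schema of the Trino tpch connector contains the standard TPC-H tables, so a response would list roughly:

```json
["customer", "lineitem", "nation", "orders", "part", "partsupp", "region", "supplier"]
```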
Get the schema of a table, understanding the structure of your data for better query planning.
Sample Prompt:
"What columns are in the customer table? I need to know the data types and structure before writing my query."
Example:
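With assumed names, targeting the customer table used in the prompt:

```json
{
  "tool": "get_table_schema",
  "arguments": {"catalog": "tpch", "schema": "tiny", "table": "customer"}
}
```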
Response:
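An illustrative response shape; the column list matches what the Trino tpch connector exposes for customer, while the JSON field names here are assumptions:

```json
[
  {"column": "custkey", "type": "bigint", "nullable": false},
  {"column": "name", "type": "varchar(25)", "nullable": false},
  {"column": "address", "type": "varchar(40)", "nullable": false},
  {"column": "nationkey", "type": "bigint", "nullable": false},
  {"column": "phone", "type": "varchar(15)", "nullable": false},
  {"column": "acctbal", "type": "double", "nullable": false},
  {"column": "mktsegment", "type": "varchar(10)", "nullable": false},
  {"column": "comment", "type": "varchar(117)", "nullable": false}
]
```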
This information is invaluable for understanding the column names, data types, and nullability constraints before writing queries against the table.
Here's a complete interaction example showing how an AI assistant might use these tools to answer a business question:
User Query: "Can you help me analyze our biggest customers? I want to know the top 5 customers with the highest account balances."
AI Assistant's workflow:
First, discover available catalogs
Then, find available schemas
Explore available tables
Check the customer table schema
Finally, execute the query
Returns the results to the user:
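For the final step, the query the assistant runs might look like this (assuming the tpch.tiny.customer table explored above):

```sql
SELECT name, acctbal
FROM tpch.tiny.customer
ORDER BY acctbal DESC
LIMIT 5
```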
This seamless workflow demonstrates how the MCP tools enable AI assistants to explore and query data in a conversational manner.
The server can be configured using the following environment variables:
| Variable | Description | Default |
| --- | --- | --- |
| TRINO_HOST | Trino server hostname | localhost |
| TRINO_PORT | Trino server port | 8080 |
| TRINO_USER | Trino user | trino |
| TRINO_PASSWORD | Trino password | (empty) |
| TRINO_CATALOG | Default catalog | memory |
| TRINO_SCHEMA | Default schema | default |
| TRINO_SCHEME | Connection scheme (http/https) | https |
| TRINO_SSL | Enable SSL | true |
| TRINO_SSL_INSECURE | Allow insecure SSL | true |
| TRINO_ALLOW_WRITE_QUERIES | Allow non-read-only SQL queries | false |
| TRINO_QUERY_TIMEOUT | Query timeout in seconds | 30 |
| MCP_TRANSPORT | Transport method (stdio/http) | stdio |
| MCP_PORT | HTTP port for http transport | 9097 |
| MCP_HOST | Host for HTTP callbacks | localhost |
Note: When TRINO_SCHEME is set to "https", TRINO_SSL is automatically set to true regardless of the provided value.
Important: The default connection mode is HTTPS. If you're using an HTTP-only Trino server, you must set TRINO_SCHEME=http in your environment variables.
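For example, to point the server at a local, non-TLS Trino while keeping the other defaults (a minimal sketch; depending on your setup you may also want TRINO_SSL=false):

```bash
TRINO_SCHEME=http TRINO_HOST=localhost TRINO_PORT=8080 mcp-trino
```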
Security Note: By default, only read-only queries (SELECT, SHOW, DESCRIBE, EXPLAIN) are allowed to prevent SQL injection. If you need to execute write operations or other non-read queries, set TRINO_ALLOW_WRITE_QUERIES=true, but be aware that this bypasses that protection.
For Cursor Integration: When using the server with Cursor, set MCP_TRANSPORT=http and connect to the /sse endpoint. The server will automatically handle SSE (Server-Sent Events) connections.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
This project uses GitHub Actions for continuous integration and GoReleaser for automated releases.
Our CI pipeline performs the following checks on all PRs and commits to the main branch:
Linting: Using golangci-lint to check for common code issues and style violations
Go Module Verification: Ensuring go.mod and go.sum are properly maintained
Formatting: Verifying code is properly formatted with gofmt
Vulnerability Scanning: Using govulncheck to check for known vulnerabilities in dependencies
Dependency Scanning: Using Trivy to scan for vulnerabilities in dependencies (CRITICAL, HIGH, and MEDIUM)
SBOM Generation: Creating a Software Bill of Materials for dependency tracking
SLSA Provenance: Creating verifiable build provenance for supply chain security
Unit Tests: Running tests with race detection and code coverage reporting
Build Verification: Ensuring the codebase builds successfully
Least Privilege: Workflows run with minimum required permissions
Pinned Versions: All GitHub Actions use specific versions to prevent supply chain attacks
Dependency Updates: Automated dependency updates via Dependabot
When changes are merged to the main branch:
CI checks are run to validate code quality and security
If successful, a new release is automatically created with:
Semantic versioning based on commit messages
Binary builds for multiple platforms
Docker image publishing to GitHub Container Registry
SBOM and provenance attestation
Download the appropriate binary for your platform from the project's GitHub releases page.
To use with Cursor, create or edit ~/.cursor/mcp.json:
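A sketch of a stdio-based entry (connection values are placeholders; swap in the Docker-based entry from the Docker section if you prefer the container):

```json
{
  "mcpServers": {
    "mcp-trino": {
      "command": "mcp-trino",
      "env": {
        "TRINO_HOST": "<your-trino-host>",
        "TRINO_PORT": "8080",
        "TRINO_USER": "trino",
        "TRINO_PASSWORD": "<your-password>"
      }
    }
  }
}
```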
To use with Claude Desktop, edit your Claude configuration file:
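The entry itself has the same mcpServers shape as the Cursor example above; only the file location differs (see the macOS and Windows paths listed earlier).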
To use with Windsurf, create or edit your mcp_config.json:
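Again, the server entry mirrors the Cursor example above; Windsurf simply reads it from its own mcp_config.json file.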
To use with ChatWise, follow these steps: