Use MCP database servers to let AI assistants query PostgreSQL and SQLite databases using natural language. The AI translates questions into SQL, executes queries via MCP tools, and returns formatted results. Configure the PostgreSQL or SQLite MCP server with read-only access, connect it to Claude Desktop or Cursor, and ask questions about your data without writing SQL manually.
Querying Databases with Natural Language via MCP
MCP database servers let AI assistants directly query your PostgreSQL and SQLite databases. Instead of writing SQL, you ask natural language questions and the AI generates the right queries, executes them, and formats the results. This tutorial covers setting up community MCP database servers, configuring them with read-only access for safety, and building custom query tools for your specific database schema.
Prerequisites
- A PostgreSQL or SQLite database with data to query
- Claude Desktop or Cursor with MCP support
- Database connection string or file path
- Basic understanding of SQL concepts
Step-by-step guide
Configure the PostgreSQL MCP server
The community PostgreSQL MCP server connects to your database and provides tools for listing tables, describing schemas, and executing queries. Add it to your MCP client configuration with the database connection string. Use a read-only database user to prevent accidental modifications. The server exposes tools like query, list_tables, and describe_table.
```
// Claude Desktop config:
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://readonly_user:password@localhost:5432/mydb"
      ]
    }
  }
}
```

```sql
-- Create a read-only PostgreSQL user:
CREATE ROLE readonly_user LOGIN PASSWORD 'secure_password';
GRANT CONNECT ON DATABASE mydb TO readonly_user;
GRANT USAGE ON SCHEMA public TO readonly_user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO readonly_user;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO readonly_user;
```

Expected result: PostgreSQL MCP server connected and providing table listing and query tools.
Configure the SQLite MCP server for local databases
For local SQLite databases, use the SQLite MCP server and point it at your database file. SQLite is simpler to set up: there is no user management. To prevent modifications, open the database read-only by passing it as a file: URI with ?mode=ro (for example, file:/path/to/your/database.db?mode=ro). SQLite only honors the mode parameter in the file: URI form; a bare path is opened read-write.
```
// Claude Desktop config:
{
  "mcpServers": {
    "sqlite": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-sqlite",
        "/path/to/your/database.db"
      ]
    }
  }
}
```

Expected result: SQLite MCP server connected and ready to query the local database.
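The read-only URI form described above can be sketched as a tiny helper (toReadOnlyUri is a hypothetical function name, not part of the MCP server):

```typescript
// Hypothetical helper: build a read-only SQLite URI from a plain file path.
// SQLite only honors query parameters such as mode=ro on "file:" URIs;
// a bare filesystem path is opened read-write.
function toReadOnlyUri(dbPath: string): string {
  return `file:${dbPath}?mode=ro`;
}

console.log(toReadOnlyUri("/path/to/your/database.db"));
// file:/path/to/your/database.db?mode=ro
```

Whether the community server passes the string through to SQLite unchanged depends on its implementation, so verify that writes actually fail before relying on this.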
Ask natural language questions about your data
Once the MCP database server is connected, simply ask questions in natural language. The AI will discover the database schema (tables, columns, types), generate appropriate SQL, execute it via the MCP tool, and format the results. Start with simple questions to verify the connection, then move to complex queries involving joins and aggregations.
```
// Example natural language queries you can ask:

// Simple: "What tables are in this database?"
// The AI calls the list_tables tool

// Schema: "Describe the users table"
// The AI calls describe_table with the table name

// Query: "How many users signed up this month?"
// The AI generates: SELECT COUNT(*) FROM users WHERE created_at >= '2026-03-01'

// Join: "Show me the top 10 customers by total order value"
// The AI generates: SELECT u.name, SUM(o.total) AS total_value
//                   FROM users u JOIN orders o ON u.id = o.user_id
//                   GROUP BY u.id ORDER BY total_value DESC LIMIT 10

// Complex: "What's the month-over-month growth rate for new signups?"
// The AI generates a window-function query using LAG()
```

Expected result: The AI translates natural language questions into SQL, executes them, and returns formatted results.
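To see what the LAG()-based answer computes, here is the same month-over-month calculation written as plain TypeScript (the sample signup numbers are made up for illustration):

```typescript
// Month-over-month growth: (current - previous) / previous,
// mirroring SQL's LAG() window function over rows ordered by month.
interface MonthlySignups { month: string; signups: number; }

function growthRates(rows: MonthlySignups[]): { month: string; growth: number | null }[] {
  return rows.map((row, i) => ({
    month: row.month,
    // The first month has no previous row, like LAG() returning NULL
    growth: i === 0 ? null : (row.signups - rows[i - 1].signups) / rows[i - 1].signups,
  }));
}

const sample: MonthlySignups[] = [
  { month: "2026-01", signups: 100 },
  { month: "2026-02", signups: 120 },
  { month: "2026-03", signups: 90 },
];
console.log(growthRates(sample));
// growth values: null, then 0.2 (+20%), then -0.25 (-25%)
```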
Build a custom database MCP server with schema-aware tools
For production use, build a custom MCP server that understands your specific database schema. Define tools with clear descriptions that reference your actual tables and columns. Add safety constraints like query timeouts, result limits, and blocked operations (DELETE, DROP, UPDATE). This gives the AI better context than generic database tools.
```typescript
// src/db-server.ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import pg from "pg";

// statement_timeout makes PostgreSQL abort any statement that runs longer
// than 5 s; node-postgres has no per-query timeout option on pool.query().
const pool = new pg.Pool({
  connectionString: process.env.DATABASE_URL,
  statement_timeout: 5000,
});
const server = new McpServer({ name: "custom-db-server", version: "1.0.0" });

// Safety: block destructive queries
const BLOCKED_PATTERNS = /\b(DELETE|DROP|TRUNCATE|UPDATE|INSERT|ALTER|CREATE)\b/i;

server.tool(
  "query_database",
  "Execute a read-only SQL query. Available tables: users (id, email, name, created_at), orders (id, user_id, total, status, created_at), products (id, name, price, category).",
  {
    sql: z.string().describe("SELECT query to execute"),
    limit: z.number().default(100).describe("Max rows to return"),
  },
  async ({ sql, limit }) => {
    if (BLOCKED_PATTERNS.test(sql)) {
      return { content: [{ type: "text", text: "Error: Only SELECT queries are allowed." }], isError: true };
    }

    try {
      // Case-insensitive word match, so "limit 5" is detected and a
      // column named "unlimited" is not a false positive
      const limitedSql = /\blimit\b/i.test(sql) ? sql : `${sql} LIMIT ${limit}`;
      const result = await pool.query(limitedSql);
      return {
        content: [{ type: "text", text: JSON.stringify({
          rowCount: result.rowCount,
          columns: result.fields.map(f => f.name),
          rows: result.rows,
        }, null, 2) }],
      };
    } catch (error) {
      return {
        content: [{ type: "text", text: `Query error: ${error instanceof Error ? error.message : String(error)}` }],
        isError: true,
      };
    }
  }
);

server.tool("get_schema", "Get the database schema for all tables", {}, async () => {
  const result = await pool.query(
    `SELECT table_name, column_name, data_type, is_nullable
     FROM information_schema.columns
     WHERE table_schema = 'public'
     ORDER BY table_name, ordinal_position`
  );
  return { content: [{ type: "text", text: JSON.stringify(result.rows, null, 2) }] };
});

async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
  console.error("Custom database MCP server running");
}
main().catch(e => { console.error(e); process.exit(1); });
```

Expected result: A custom MCP server with safety-constrained query execution and schema discovery tools.
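Because both safety checks are pure string logic, you can sanity-check them in isolation. This sketch reuses the same destructive-statement regex with a case-insensitive variant of the LIMIT check (guardQuery is a hypothetical helper name, not part of the SDK):

```typescript
// Destructive-statement filter and LIMIT check, isolated for testing.
const BLOCKED_PATTERNS = /\b(DELETE|DROP|TRUNCATE|UPDATE|INSERT|ALTER|CREATE)\b/i;
const HAS_LIMIT = /\blimit\b/i;

function guardQuery(sql: string, limit = 100): string {
  if (BLOCKED_PATTERNS.test(sql)) throw new Error("Only SELECT queries are allowed.");
  // Append a LIMIT only when the query does not already contain one
  return HAS_LIMIT.test(sql) ? sql : `${sql} LIMIT ${limit}`;
}

console.log(guardQuery("SELECT * FROM users"));         // SELECT * FROM users LIMIT 100
console.log(guardQuery("SELECT * FROM users LIMIT 5")); // SELECT * FROM users LIMIT 5
// guardQuery("DELETE FROM users") throws
```

Note that a regex denylist is a pragmatic guard, not a SQL parser: keep the read-only database user as the real enforcement layer.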
Complete working example
```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import pg from "pg";

const pool = new pg.Pool({
  connectionString: process.env.DATABASE_URL,
  max: 5,
  idleTimeoutMillis: 30000,
  // Server-side cap: PostgreSQL aborts any statement running longer than 10 s
  statement_timeout: 10000,
});

const BLOCKED = /\b(DELETE|DROP|TRUNCATE|UPDATE|INSERT|ALTER|CREATE|GRANT|REVOKE)\b/i;

const server = new McpServer({ name: "db-query-server", version: "1.0.0" });

server.tool("get_schema", "List all tables and their columns", {}, async () => {
  const r = await pool.query(
    `SELECT table_name, column_name, data_type, is_nullable
     FROM information_schema.columns WHERE table_schema = 'public'
     ORDER BY table_name, ordinal_position`
  );
  return { content: [{ type: "text", text: JSON.stringify(r.rows, null, 2) }] };
});

server.tool("query", "Run a read-only SQL query against the database", {
  sql: z.string().describe("SQL SELECT query"),
  limit: z.number().default(100),
}, async ({ sql, limit }) => {
  if (BLOCKED.test(sql)) {
    return { content: [{ type: "text", text: "Error: Only SELECT queries allowed" }], isError: true };
  }
  try {
    // Word-boundary match avoids false positives like a column named "unlimited"
    const q = /\blimit\b/i.test(sql) ? sql : `${sql} LIMIT ${limit}`;
    const r = await pool.query(q);
    const out = {
      rows: r.rowCount,
      columns: r.fields.map(f => f.name),
      data: r.rows,
    };
    return { content: [{ type: "text", text: JSON.stringify(out, null, 2) }] };
  } catch (e) {
    return { content: [{ type: "text", text: `Error: ${e instanceof Error ? e.message : String(e)}` }], isError: true };
  }
});

server.tool("table_stats", "Get row counts for all tables", {}, async () => {
  // "table" is a reserved word, so alias relname to table_name instead
  const r = await pool.query(
    `SELECT schemaname, relname AS table_name, n_live_tup AS row_count
     FROM pg_stat_user_tables ORDER BY n_live_tup DESC`
  );
  return { content: [{ type: "text", text: JSON.stringify(r.rows, null, 2) }] };
});

async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
  console.error("Database MCP server running");
}
main().catch(e => { console.error(e); process.exit(1); });
```

Common mistakes when using MCP to query databases from AI assistants
Mistake: Giving the MCP database server write access, risking accidental data modification.
How to avoid: Always use a read-only database user. Block DELETE, UPDATE, INSERT, and DDL statements in the tool handler.
Mistake: Not setting query timeouts, allowing expensive queries to run indefinitely.
How to avoid: Set a query timeout (5-10 seconds) and add a LIMIT clause to every query to prevent runaway resource usage.
Mistake: Exposing sensitive data such as passwords or PII through unrestricted queries.
How to avoid: Limit which tables the read-only user can access. Exclude sensitive columns from the schema tool output.
Mistake: Not including table schemas in tool descriptions, forcing extra discovery round trips.
How to avoid: List key tables and columns in the query tool description so the AI can generate correct queries on the first try.
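As a sketch of where those limits live in node-postgres: timeouts are configured on the pool/client, not on individual pool.query() calls (the values here are illustrative, not prescriptive):

```typescript
// Illustrative node-postgres pool options for bounding query cost.
// statement_timeout is enforced server-side by PostgreSQL;
// query_timeout is enforced client-side by node-postgres.
const poolConfig = {
  connectionString: process.env.DATABASE_URL,
  max: 5,                   // cap concurrent connections
  statement_timeout: 5000,  // server aborts any statement after 5 s
  query_timeout: 10000,     // client gives up waiting after 10 s
};
// const pool = new pg.Pool(poolConfig);
console.log(poolConfig.statement_timeout, poolConfig.query_timeout);
```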
Best practices
- Use read-only database credentials for all MCP database servers
- Block destructive SQL patterns (DELETE, DROP, UPDATE) at the tool handler level
- Set query timeouts to prevent expensive queries from consuming resources
- Add automatic LIMIT clauses to prevent returning millions of rows
- Include table names and key columns in tool descriptions for better AI query generation
- Expose a schema discovery tool so the AI can understand the database structure
- Log all executed queries to stderr for auditing and debugging
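The query-logging practice above can be a thin wrapper around whatever executor your server uses; stderr is the right stream because stdout carries the MCP stdio protocol. Here withAuditLog and the fake executor are hypothetical stand-ins for your real pool.query call:

```typescript
// Hypothetical audit wrapper: log every SQL statement to stderr before running it.
type Executor = (sql: string) => Promise<unknown>;

function withAuditLog(execute: Executor): Executor {
  return async (sql: string) => {
    // stderr only: stdout is reserved for the MCP protocol stream
    console.error(`[${new Date().toISOString()}] SQL: ${sql}`);
    return execute(sql);
  };
}

// Usage with a fake executor standing in for pool.query:
const run = withAuditLog(async (sql) => ({ rows: [], sql }));
run("SELECT 1").then((r) => console.log(r));
```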
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I want to query my PostgreSQL database using natural language via MCP. Show me how to set up the PostgreSQL MCP server with a read-only user, build a custom MCP server that blocks destructive queries, and configure Claude Desktop to connect.
Set up a database MCP server that lets me query PostgreSQL from Claude Desktop. Include read-only safety checks, query timeouts, automatic LIMIT, schema discovery, and table stats tools. Use @modelcontextprotocol/sdk.
Frequently asked questions
Is it safe to connect an AI to my production database?
Only with strict safeguards: read-only credentials, query timeouts, blocked destructive statements, and limited table access. Never use admin or write-capable credentials for MCP database servers.
Can the AI handle complex joins and aggregations?
Yes. Modern LLMs like Claude and GPT-4 are excellent at generating SQL for complex queries including joins, subqueries, window functions, and CTEs. Providing the schema helps accuracy significantly.
How do I prevent the AI from querying sensitive tables?
Create a database user that only has SELECT permission on non-sensitive tables. Alternatively, block specific table names in the tool handler's SQL validation.
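The handler-side option can be as simple as a denylist check run before the query executes (the table names here are hypothetical examples of sensitive tables):

```typescript
// Hypothetical denylist: reject queries that reference sensitive tables.
const SENSITIVE_TABLES = /\b(password_resets|api_keys|audit_log)\b/i;

function isAllowed(sql: string): boolean {
  return !SENSITIVE_TABLES.test(sql);
}

console.log(isAllowed("SELECT * FROM users"));    // true
console.log(isAllowed("SELECT * FROM api_keys")); // false
```

Database-level permissions remain the stronger guarantee; treat the denylist as defense in depth rather than the primary control.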
Can RapidDev help set up secure database access via MCP?
Yes. RapidDev configures production-ready MCP database servers with proper security, monitoring, and access controls for teams that need AI-powered data access.
Does this work with MySQL or MongoDB?
The pattern works with any database. For MySQL, use a MySQL MCP server or build a custom one with the mysql2 npm package. For MongoDB, build a custom server with the mongodb driver.