Build an MCP server that gives AI clients safe access to a PostgreSQL database. Use connection pooling with pg, expose tools for running parameterized queries, and register resources for schema inspection. Always use parameterized queries to prevent SQL injection and restrict to read-only operations unless write access is explicitly needed.
Building an MCP Server with Database Access
Connecting an MCP server to a database lets AI clients query data, inspect schemas, and perform data operations through natural language. This is one of the most powerful MCP patterns — the AI can answer questions about your data without you writing custom queries.
However, database access requires careful security design. AI models can generate unexpected queries, so you need parameterized queries to prevent SQL injection, read-only restrictions to prevent accidental data modification, and connection pooling to prevent resource exhaustion. This tutorial builds a production-ready PostgreSQL MCP server step by step.
Prerequisites
- PostgreSQL 13+ running locally or remotely
- MCP TypeScript SDK installed (@modelcontextprotocol/sdk)
- pg (node-postgres) package installed
- A database with tables and data to query
- Basic SQL knowledge
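Assuming a Node.js project already initialized with npm, the dependencies listed above can be installed like so (the dev-dependency line applies only if you are compiling TypeScript yourself):

```shell
# MCP SDK, PostgreSQL driver, and zod for tool input schemas
npm install @modelcontextprotocol/sdk pg zod

# Type definitions for pg when using TypeScript
npm install --save-dev typescript @types/pg
```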
Step-by-step guide
Set up connection pooling with pg
Create a PostgreSQL connection pool that manages connections efficiently. Connection pooling is critical for MCP servers because each tool call may execute a query, and creating a new connection per call would quickly exhaust database resources. Configure the pool with reasonable limits and load the connection string from an environment variable.
```typescript
import { Pool } from "pg";

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 10,                       // max connections in pool
  idleTimeoutMillis: 30000,      // close idle connections after 30s
  connectionTimeoutMillis: 5000, // fail if can't connect in 5s
});

// Test connection on startup
pool.query("SELECT 1").then(() => {
  console.error("Database connected");
}).catch((err) => {
  console.error("Database connection failed:", err.message);
  process.exit(1);
});
```

Expected result: A connection pool ready to handle concurrent queries from MCP tool calls.
Create a read-only query tool with parameterized queries
Register a tool that accepts SQL from the AI but restricts it to SELECT statements: validate that the query starts with SELECT before executing, and block multi-statement queries that chain write operations. When a tool interpolates user-provided values, always pass them as query parameters to prevent SQL injection. Return results as formatted JSON.
```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "postgres-server", version: "1.0.0" });

server.registerTool("query", {
  description: "Run a read-only SQL query against the database. Only SELECT statements are allowed.",
  inputSchema: {
    sql: z.string().describe("SQL SELECT query to execute"),
  },
}, async ({ sql }) => {
  const trimmed = sql.trim();

  // Enforce read-only
  if (!/^SELECT\b/i.test(trimmed)) {
    return {
      content: [{ type: "text", text: "Error: Only SELECT queries are allowed. Use INSERT, UPDATE, or DELETE tools for write operations." }],
      isError: true,
    };
  }

  // Block dangerous patterns
  if (/;\s*(DROP|DELETE|UPDATE|INSERT|ALTER|CREATE|TRUNCATE)/i.test(trimmed)) {
    return {
      content: [{ type: "text", text: "Error: Multi-statement queries with write operations are not allowed." }],
      isError: true,
    };
  }

  try {
    const result = await pool.query(trimmed);
    return {
      content: [{
        type: "text",
        text: JSON.stringify({
          rowCount: result.rowCount,
          rows: result.rows.slice(0, 100), // Limit response size
        }, null, 2),
      }],
    };
  } catch (error) {
    console.error("Query error:", error);
    return {
      content: [{ type: "text", text: `Error: ${(error as Error).message}` }],
      isError: true,
    };
  }
});
```

Expected result: The AI can run SELECT queries and receive JSON-formatted results, but cannot execute write operations.
Expose database schema as a resource
Register an MCP resource that returns your database schema. This gives the AI the context it needs to write correct queries without you listing every table and column in tool descriptions. The schema resource should include table names, column names, types, and relationships.
```typescript
server.registerResource("database-schema", "db://schema", {
  description: "Complete database schema with tables, columns, and types",
  mimeType: "text/plain",
}, async (uri) => {
  const result = await pool.query(`
    SELECT
      t.table_name,
      c.column_name,
      c.data_type,
      c.is_nullable,
      c.column_default
    FROM information_schema.tables t
    JOIN information_schema.columns c
      ON t.table_name = c.table_name AND t.table_schema = c.table_schema
    WHERE t.table_schema = 'public'
    ORDER BY t.table_name, c.ordinal_position
  `);

  // Format as readable text
  const tables = new Map<string, string[]>();
  for (const row of result.rows) {
    const cols = tables.get(row.table_name) || [];
    cols.push(`  ${row.column_name} ${row.data_type}${row.is_nullable === 'NO' ? ' NOT NULL' : ''}${row.column_default ? ` DEFAULT ${row.column_default}` : ''}`);
    tables.set(row.table_name, cols);
  }

  const schema = Array.from(tables.entries())
    .map(([table, cols]) => `TABLE ${table}:\n${cols.join('\n')}`)
    .join('\n\n');

  return {
    contents: [{ uri: uri.href, text: schema }],
  };
});
```

Expected result: The AI reads the schema resource to understand your database structure before writing queries.
Add a table listing tool for quick discovery
A lightweight tool that lists all tables with row counts helps the AI quickly understand the database without reading the full schema. This is faster for initial exploration and consumes less context than the full schema resource. Note that the row counts from pg_stat_user_tables are approximate statistics, not exact counts.
```typescript
server.registerTool("list-tables", {
  description: "List all tables in the database with row counts",
  inputSchema: {},
}, async () => {
  try {
    const result = await pool.query(`
      SELECT
        schemaname,
        relname AS table_name,
        n_live_tup AS approximate_row_count
      FROM pg_stat_user_tables
      ORDER BY n_live_tup DESC
    `);
    return {
      content: [{ type: "text", text: JSON.stringify(result.rows, null, 2) }],
    };
  } catch (error) {
    return {
      content: [{ type: "text", text: `Error: ${(error as Error).message}` }],
      isError: true,
    };
  }
});
```

Expected result: The AI gets a quick overview of all tables and their sizes before diving into specific queries.
Add a describe-table tool for column details
Give the AI a tool to inspect a specific table's columns, types, constraints, and indexes. This is more efficient than loading the entire schema when the AI only needs details about one table. For teams building database MCP servers for production analytics, RapidDev can help design the query layer with proper indexing and access control.
```typescript
server.registerTool("describe-table", {
  description: "Get detailed column information for a specific table",
  inputSchema: {
    tableName: z.string().regex(/^[a-zA-Z_][a-zA-Z0-9_]*$/).describe("Table name (alphanumeric and underscores only)"),
  },
}, async ({ tableName }) => {
  try {
    const columns = await pool.query(
      `SELECT column_name, data_type, is_nullable, column_default
       FROM information_schema.columns
       WHERE table_schema = 'public' AND table_name = $1
       ORDER BY ordinal_position`,
      [tableName]
    );

    if (columns.rowCount === 0) {
      return {
        content: [{ type: "text", text: `Error: Table '${tableName}' not found` }],
        isError: true,
      };
    }

    const indexes = await pool.query(
      `SELECT indexname, indexdef FROM pg_indexes
       WHERE schemaname = 'public' AND tablename = $1`,
      [tableName]
    );

    return {
      content: [{
        type: "text",
        text: JSON.stringify({ columns: columns.rows, indexes: indexes.rows }, null, 2),
      }],
    };
  } catch (error) {
    return {
      content: [{ type: "text", text: `Error: ${(error as Error).message}` }],
      isError: true,
    };
  }
});
```

Expected result: The AI gets column details and indexes for a specific table before constructing targeted queries.
Complete working example
```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import { Pool } from "pg";

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 10,
  idleTimeoutMillis: 30000,
});

const server = new McpServer({ name: "postgres-mcp", version: "1.0.0" });

// Resource: full schema
server.registerResource("schema", "db://schema", {
  description: "Database schema",
  mimeType: "text/plain",
}, async (uri) => {
  const r = await pool.query(
    `SELECT table_name, column_name, data_type, is_nullable
     FROM information_schema.columns WHERE table_schema='public'
     ORDER BY table_name, ordinal_position`
  );
  const tables = new Map<string, string[]>();
  for (const row of r.rows) {
    const c = tables.get(row.table_name) || [];
    c.push(`  ${row.column_name} ${row.data_type}${row.is_nullable === 'NO' ? ' NOT NULL' : ''}`);
    tables.set(row.table_name, c);
  }
  const text = [...tables].map(([t, c]) => `TABLE ${t}:\n${c.join('\n')}`).join('\n\n');
  return { contents: [{ uri: uri.href, text }] };
});

// Tool: list tables
server.registerTool("list-tables", {
  description: "List all database tables with approximate row counts",
  inputSchema: {},
}, async () => {
  const r = await pool.query(
    `SELECT relname AS table_name, n_live_tup AS row_count
     FROM pg_stat_user_tables ORDER BY n_live_tup DESC`
  );
  return { content: [{ type: "text", text: JSON.stringify(r.rows, null, 2) }] };
});

// Tool: describe table
server.registerTool("describe-table", {
  description: "Get columns and indexes for a table",
  inputSchema: {
    table: z.string().regex(/^[a-zA-Z_]\w*$/).describe("Table name"),
  },
}, async ({ table }) => {
  const cols = await pool.query(
    `SELECT column_name, data_type, is_nullable, column_default
     FROM information_schema.columns
     WHERE table_schema='public' AND table_name=$1
     ORDER BY ordinal_position`, [table]
  );
  if (!cols.rowCount) {
    return { content: [{ type: "text", text: `Table '${table}' not found` }], isError: true };
  }
  return { content: [{ type: "text", text: JSON.stringify(cols.rows, null, 2) }] };
});

// Tool: read-only query
server.registerTool("query", {
  description: "Run a read-only SELECT query",
  inputSchema: { sql: z.string().describe("SQL SELECT query") },
}, async ({ sql }) => {
  if (!/^SELECT\b/i.test(sql.trim())) {
    return { content: [{ type: "text", text: "Error: Only SELECT allowed" }], isError: true };
  }
  try {
    const r = await pool.query(sql);
    return { content: [{ type: "text", text: JSON.stringify({
      rowCount: r.rowCount, rows: r.rows.slice(0, 100)
    }, null, 2) }] };
  } catch (e) {
    return { content: [{ type: "text", text: `Error: ${(e as Error).message}` }], isError: true };
  }
});

const transport = new StdioServerTransport();
await server.connect(transport);
console.error("PostgreSQL MCP server running");
```

Common mistakes when building an MCP server that queries a database
Not using connection pooling
Why it's a problem: Creating a new connection per query exhausts database resources quickly.
How to avoid: Create a Pool instance and reuse it across all tool calls.

Allowing arbitrary SQL without validation
Why it's a problem: AI models can generate unexpected queries, including ones that modify or delete data.
How to avoid: Validate that queries start with SELECT. Block multi-statement queries containing write operations. Consider using a read-only database user.

Not limiting result size
Why it's a problem: Large result sets overflow the AI's context window and waste tokens.
How to avoid: Slice results to a reasonable maximum (e.g., 100 rows).

Hardcoding the database connection string
Why it's a problem: Credentials committed to source control are exposed to anyone with repository access.
How to avoid: Load DATABASE_URL from environment variables. Never commit connection strings to source control.

Using string concatenation for query parameters
Why it's a problem: String concatenation enables SQL injection.
How to avoid: Always use parameterized queries ($1, $2) with the values array.
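The read-only validation used throughout this tutorial can be factored into a small pure function, which makes it easy to unit-test. This is a sketch of the same prefix-check-plus-blocklist approach shown in the query tool (the helper name isReadOnlySelect is ours, not part of any SDK); a read-only database user remains the stronger guarantee.

```typescript
// Returns true only for queries that start with SELECT and do not
// chain a second statement containing a write operation.
function isReadOnlySelect(sql: string): boolean {
  const trimmed = sql.trim();
  // Must begin with SELECT (case-insensitive, word boundary)
  if (!/^SELECT\b/i.test(trimmed)) return false;
  // Reject multi-statement queries that smuggle in writes
  if (/;\s*(DROP|DELETE|UPDATE|INSERT|ALTER|CREATE|TRUNCATE)/i.test(trimmed)) return false;
  return true;
}
```

A tool handler can call this guard first and return an isError response when it fails, keeping the validation logic in one place.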
Best practices
- Use a dedicated read-only database user for the MCP server connection
- Always use parameterized queries ($1, $2) — never concatenate values into SQL strings
- Expose the database schema as a resource so the AI can write correct queries
- Limit query results to 100 rows to prevent context window overflow
- Set query timeouts (statement_timeout) to prevent runaway queries
- Log all queries to stderr for audit and debugging purposes
- Use connection pooling with reasonable limits (10-20 connections)
- Validate table names with regex when used in query identifiers
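Two of the practices above, identifier validation and row limiting, are pure functions that are easy to test in isolation. A sketch (the helper names are ours):

```typescript
// Allow only simple unquoted PostgreSQL identifiers: a letter or
// underscore first, then letters, digits, or underscores.
function isValidIdentifier(name: string): boolean {
  return /^[a-zA-Z_][a-zA-Z0-9_]*$/.test(name);
}

// Cap the rows returned to the model and flag when output was cut.
function limitRows<T>(rows: T[], max = 100): { rows: T[]; truncated: boolean } {
  return { rows: rows.slice(0, max), truncated: rows.length > max };
}
```

When the truncated flag is set, it helps to say so in the tool response text so the AI knows to narrow its query rather than assume it saw everything.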
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I'm building an MCP server that connects to PostgreSQL. Show me how to create a read-only query tool with connection pooling, parameterized queries, SQL injection prevention, and result size limiting using the MCP TypeScript SDK and pg.
Add a database query tool to my MCP server. Use the pg Pool for connection management, validate that only SELECT queries are allowed, limit results to 100 rows, and return errors with isError: true. Also add a schema resource.
Frequently asked questions
Should I use a read-only database user for the MCP server?
Yes. Create a PostgreSQL user with SELECT-only grants and use that for the MCP connection. This provides defense in depth — even if SQL validation is bypassed, the database itself blocks writes.
How do I prevent the AI from running expensive queries?
Set statement_timeout on the database connection (e.g., SET statement_timeout = '5000' for 5 seconds). Also limit result rows and block queries without WHERE clauses on large tables.
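node-postgres also accepts a statement_timeout option directly in the pool configuration, which applies the limit to every connection the pool creates. A configuration sketch (the 5-second value is an example, not a recommendation for every workload):

```typescript
import { Pool } from "pg";

// Any query on a connection from this pool is cancelled by the
// server if it runs longer than 5 seconds.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  statement_timeout: 5000, // milliseconds
});
```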
Can I expose multiple databases from one MCP server?
Yes. Create multiple Pool instances and use tool names or parameters to route queries to the correct database. For example: query-analytics, query-users for different databases.
How do I handle database migrations with an MCP server?
Run migrations through your normal workflow (Prisma, Drizzle, raw SQL). After schema changes, the AI will see the updated schema next time it reads the schema resource. No MCP server restart needed.
Is it safe to let the AI run arbitrary SELECT queries?
With proper safeguards (read-only user, statement timeout, result limits), SELECT queries are safe. The main risk is performance — a complex JOIN on large tables can be slow. Use statement_timeout to prevent this.
Can I use this pattern with MySQL or SQLite?
Yes. Replace the pg Pool with mysql2 or better-sqlite3. The MCP tool pattern is the same — validate SQL, execute with parameterized queries, return JSON results. For multi-database MCP architectures, the RapidDev team can help design the right abstraction layer.