
How to Generate Advanced Database Indexes with Cursor

Cursor can generate basic CREATE INDEX statements but often misses advanced PostgreSQL features like partial indexes, expression indexes, and covering indexes. By providing schema context via @file, using a .cursor/rules/ entry with index best practices, and prompting with specific performance requirements, you can get Cursor to produce optimized index definitions that match your query patterns.

What you'll learn

  • How to prompt Cursor to generate partial and expression indexes
  • How to provide schema context so Cursor understands your table structure
  • How to create rules that guide Cursor toward optimal index strategies
  • How to use Cursor Chat to analyze slow queries and suggest indexes
Beginner · 7 min read · 10-15 min · Cursor Free+, PostgreSQL projects · March 2026 · RapidDev Engineering Team

Getting advanced PostgreSQL indexes from Cursor

PostgreSQL supports partial indexes, expression indexes, GIN indexes for JSONB, and covering indexes that can dramatically improve query performance. Cursor often defaults to simple B-tree indexes on single columns. This tutorial teaches you how to provide enough context and rules so Cursor generates indexes tailored to your actual query patterns and data distribution.

Prerequisites

  • Cursor installed with a PostgreSQL project open
  • Basic understanding of SQL indexes and EXPLAIN output
  • Access to your database schema or migration files
  • Familiarity with Cursor Chat (Cmd+L)

Step-by-step guide

1

Create a database indexing rule for Cursor

Add a project rule with PostgreSQL indexing best practices. This tells Cursor to consider partial indexes, expression indexes, and covering indexes instead of defaulting to simple single-column B-tree indexes on every column.

.cursor/rules/postgres-indexes.mdc
---
description: PostgreSQL indexing best practices for generated migrations
globs: "*.sql,*.migration.ts,**/migrations/**"
alwaysApply: false
---

# PostgreSQL Index Rules
- Consider partial indexes (WHERE clause) when queries filter on a common condition
- Use expression indexes for computed lookups like LOWER(email)
- Use GIN indexes for JSONB columns and full-text search
- Use covering indexes (INCLUDE) to enable index-only scans
- Always add CONCURRENTLY to CREATE INDEX in production migrations
- Name indexes descriptively: idx_{table}_{columns}_{type}
- Add comments explaining which query pattern each index supports
- Never create indexes on low-cardinality boolean columns without a partial WHERE clause

Expected result: Cursor considers advanced index types when generating database migration code.
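
To ground these rules, here is what each advanced index type looks like in plain SQL. The users table and its columns are hypothetical, and the INCLUDE clause assumes PostgreSQL 11 or later:

```sql
-- Partial index: only rows matching the WHERE clause are indexed
CREATE INDEX idx_users_email_unverified
    ON users (email)
    WHERE verified = false;

-- Expression index: supports WHERE LOWER(email) = $1 lookups
CREATE INDEX idx_users_email_lower
    ON users (LOWER(email));

-- GIN index: supports containment queries on a JSONB column
CREATE INDEX idx_users_preferences_gin
    ON users USING GIN (preferences);

-- Covering index: INCLUDE lets the planner answer
-- SELECT name FROM users WHERE email = $1 with an index-only scan
CREATE INDEX idx_users_email_covering
    ON users (email) INCLUDE (name);
```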

2

Provide your schema as context for index generation

Cursor needs to see your table definitions to generate useful indexes. Open Chat with Cmd+L and reference your schema file directly. Include information about your most common query patterns so Cursor can match indexes to actual usage.

Cmd+L prompt
@src/db/schema.sql @postgres-indexes.mdc

Here are our most common query patterns for the orders table:
1. Find active orders for a specific user: WHERE user_id = $1 AND status = 'active'
2. Search orders by date range: WHERE created_at BETWEEN $1 AND $2
3. Find orders by lowercase email: WHERE LOWER(customer_email) = $1
4. Full-text search on order notes: WHERE to_tsvector('english', notes) @@ to_tsquery($1)
5. Dashboard count of pending orders: WHERE status = 'pending'

Generate optimal PostgreSQL indexes for each query pattern.
Use partial indexes where most rows would be excluded.
Include CONCURRENTLY and descriptive names.

Pro tip: Paste EXPLAIN ANALYZE output from slow queries directly into the Chat. Cursor reads query plans and can suggest indexes specifically targeting sequential scans.

Expected result: Cursor generates five targeted indexes including partial indexes with WHERE clauses, an expression index on LOWER(customer_email), and a GIN index for full-text search.

3

Generate a migration file with the indexes

Ask Cursor to wrap the generated indexes in a proper migration file. Reference your migration tool so Cursor generates the correct format. This step produces a ready-to-run migration rather than loose SQL statements.

Cmd+L prompt
@postgres-indexes.mdc

Create a database migration file that adds these indexes to the orders table:
1. Partial index on (user_id) WHERE status = 'active'
2. Partial index on (created_at) WHERE status != 'archived'
3. Expression index on LOWER(customer_email)
4. GIN index on notes using to_tsvector('english', notes)
5. Partial index on (status) WHERE status = 'pending' for dashboard counts

Use CREATE INDEX CONCURRENTLY for all indexes.
Format as a Knex.js migration with up and down functions.

Expected result: Cursor generates a complete Knex migration file with five CONCURRENTLY-created indexes and proper rollback in the down function.

4

Use Cursor to analyze query plans and suggest improvements

Copy the output of EXPLAIN ANALYZE from a slow query and paste it into Cursor Chat. Cursor can read query plans and identify missing indexes, inefficient sequential scans, and opportunities for covering indexes that enable index-only scans.

Cmd+L prompt
@postgres-indexes.mdc

Here is the EXPLAIN ANALYZE output for a slow query:

Seq Scan on orders  (cost=0.00..15420.00 rows=52 width=204)
  Filter: ((status = 'active') AND (user_id = 'abc-123'))
  Rows Removed by Filter: 499948
Planning Time: 0.15 ms
Execution Time: 142.8 ms

The table has 500,000 rows. Only 2% have status = 'active'.
Suggest the optimal index to eliminate the sequential scan.
Explain why a partial index is better than a regular index here.

Expected result: Cursor recommends a partial index on (user_id) WHERE status = 'active' and explains that it covers only 2% of rows, making it smaller and faster than a full index.
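
The recommendation Cursor should arrive at for this plan looks like the following sketch; the index name follows the naming rule from step 1:

```sql
-- Indexes only the ~10,000 rows (2% of 500,000) where status = 'active',
-- so it is far smaller than a full index and cheaper to maintain on writes
CREATE INDEX CONCURRENTLY idx_orders_user_id_active
    ON orders (user_id)
    WHERE status = 'active';

-- The full composite alternative would index all 500,000 rows:
-- CREATE INDEX idx_orders_user_id_status ON orders (user_id, status);
```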

5

Verify indexes with an EXPLAIN check prompt

After applying your migration, use Cursor to generate verification queries. This step ensures the indexes are actually being used by your query patterns and not being ignored by the PostgreSQL query planner.

Cmd+L prompt
@postgres-indexes.mdc

Generate EXPLAIN ANALYZE statements to verify these indexes are used:
1. SELECT * FROM orders WHERE user_id = $1 AND status = 'active'
2. SELECT * FROM orders WHERE LOWER(customer_email) = $1
3. SELECT * FROM orders WHERE to_tsvector('english', notes) @@ to_tsquery('english', $1)
4. SELECT COUNT(*) FROM orders WHERE status = 'pending'

For each, show what the expected plan should look like
(Index Scan or Index Only Scan, not Seq Scan).
Include the SET enable_seqscan = off trick for testing.

Expected result: Cursor generates EXPLAIN ANALYZE queries for each pattern and describes the expected Index Scan behavior.
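
A verification session generated this way might look like the following sketch; the index names assume the migration from this tutorial has been applied:

```sql
-- Discourage sequential scans in this session only, to confirm each
-- index is usable (not that the planner will always choose it)
SET enable_seqscan = off;

EXPLAIN ANALYZE
SELECT * FROM orders
WHERE user_id = 'abc-123' AND status = 'active';
-- Should show: Index Scan using idx_orders_user_id_active

EXPLAIN ANALYZE
SELECT COUNT(*) FROM orders WHERE status = 'pending';
-- Should show: Index Only Scan using idx_orders_status_pending
-- (index-only scans may require a recent VACUUM so the visibility map is current)

-- Restore the default before running anything else
RESET enable_seqscan;
```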

Complete working example

migrations/20260325_add_order_indexes.ts
import { Knex } from 'knex';

// CREATE INDEX CONCURRENTLY cannot run inside a transaction, so opt this
// migration out of the transaction Knex opens by default
export const config = { transaction: false };

export async function up(knex: Knex): Promise<void> {
  // Partial index: active orders by user (covers ~2% of rows)
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_user_id_active
    ON orders (user_id)
    WHERE status = 'active'
  `);

  // Partial index: date range queries excluding archived orders
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_created_at_not_archived
    ON orders (created_at)
    WHERE status != 'archived'
  `);

  // Expression index: case-insensitive email lookup
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_email_lower
    ON orders (LOWER(customer_email))
  `);

  // GIN index: full-text search on notes
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_notes_fts
    ON orders USING GIN (to_tsvector('english', notes))
  `);

  // Partial index: pending status for dashboard counts
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_status_pending
    ON orders (status)
    WHERE status = 'pending'
  `);
}

export async function down(knex: Knex): Promise<void> {
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_user_id_active');
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_created_at_not_archived');
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_email_lower');
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_notes_fts');
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_status_pending');
}

Common mistakes when generating advanced database indexes with Cursor

Mistake: Asking Cursor for indexes without providing the table schema.

Why it's a problem: Without the schema, Cursor guesses at column names and types and tends to fall back to generic single-column B-tree indexes.

How to avoid: Always reference your schema file with @file and describe your most common query patterns in the prompt.

Mistake: Creating indexes without CONCURRENTLY on production tables.

Why it's a problem: A plain CREATE INDEX takes a lock that blocks writes to the table for the entire duration of the build.

How to avoid: Include CONCURRENTLY in your indexing rules. Note that CONCURRENTLY cannot run inside a transaction, so disable the migration's transaction (in Knex, export config = { transaction: false }) and issue the statement with knex.raw instead of the schema builder.

Mistake: Creating full indexes on boolean or low-cardinality columns.

Why it's a problem: Such an index is nearly as large as the table, slows every write, and the planner will often ignore it because the condition filters out too few rows.

How to avoid: Specify in your rules to use partial indexes for columns where the filtered condition matches a small percentage of total rows.
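
The CONCURRENTLY caveat is the easiest one to get wrong. In Knex, opting a migration out of its default transaction looks like the following sketch (assuming Knex's per-migration config export; index name illustrative):

```typescript
import { Knex } from 'knex';

// Knex wraps each migration in a transaction by default, and
// CREATE INDEX CONCURRENTLY refuses to run inside one
export const config = { transaction: false };

export async function up(knex: Knex): Promise<void> {
  await knex.raw(`
    CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_user_id_active
    ON orders (user_id)
    WHERE status = 'active'
  `);
}

export async function down(knex: Knex): Promise<void> {
  await knex.raw('DROP INDEX CONCURRENTLY IF EXISTS idx_orders_user_id_active');
}
```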

Best practices

  • Always provide EXPLAIN ANALYZE output when asking Cursor to optimize queries
  • Include data distribution estimates in your prompts so Cursor can judge index selectivity
  • Use descriptive index names with the pattern idx_{table}_{columns}_{type}
  • Add SQL comments above each index explaining which query it supports
  • Test index usage with EXPLAIN ANALYZE after creation to verify the planner uses them
  • Keep indexing rules separate from general SQL rules for cleaner rule management
  • Review Cursor-generated indexes against your actual slow query log before deploying

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I have a PostgreSQL orders table with 500K rows. Only 2% have status='active'. Generate a partial index for queries that filter on user_id WHERE status='active'. Explain why a partial index is better than a regular composite index here.

Cursor Prompt

@postgres-indexes.mdc @src/db/schema.sql Analyze the orders table schema and generate optimal PostgreSQL indexes for these query patterns: active orders by user, date range lookups, case-insensitive email search, and full-text search on notes. Use partial indexes and CONCURRENTLY.

Frequently asked questions

Can Cursor read my actual database to suggest indexes?

Not directly, but you can use a Database MCP server to connect Cursor to your database. Alternatively, paste EXPLAIN ANALYZE output and table statistics into Chat for Cursor to analyze.

Should I create indexes on every column Cursor suggests?

No. Each index adds write overhead. Only create indexes for columns that appear in frequent WHERE, JOIN, or ORDER BY clauses. Use pg_stat_user_indexes to check if existing indexes are actually being used.
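
The pg_stat_user_indexes check mentioned above can be run directly against PostgreSQL's statistics views:

```sql
-- Indexes that have never been scanned since statistics were last reset,
-- largest first: prime candidates for removal
SELECT schemaname,
       relname      AS table_name,
       indexrelname AS index_name,
       idx_scan     AS times_used,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size
FROM pg_stat_user_indexes
WHERE idx_scan = 0
ORDER BY pg_relation_size(indexrelid) DESC;
```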

How do I know if a partial index is worth it?

A partial index is beneficial when the WHERE condition matches a small fraction of total rows, typically under 20%. The smaller the fraction, the bigger the performance gain compared to a full index.
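
You can measure both inputs to that decision directly in psql. The index name below assumes the migration from this tutorial has been applied:

```sql
-- What fraction of rows does the partial condition actually cover?
SELECT count(*) FILTER (WHERE status = 'active')::numeric / count(*) AS active_fraction
FROM orders;

-- How large is the partial index on disk?
SELECT pg_size_pretty(pg_relation_size('idx_orders_user_id_active')) AS partial_index_size;
```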

Does Cursor know about PostgreSQL 16 features?

Cursor's knowledge depends on the underlying model. Reference @docs with the PostgreSQL 16 documentation URL to ensure Cursor generates syntax compatible with the latest features.

Can I use this approach with MySQL or SQLite?

The principles apply, but the syntax differs. MySQL has no native partial indexes; you can emulate expression indexes with generated columns (5.7+) or functional indexes (8.0.13+). SQLite supports partial and expression indexes but has no GIN or covering INCLUDE equivalent. Adjust your rules file for your specific database.
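
For example, the LOWER(customer_email) lookup from this tutorial would be expressed in MySQL through a stored generated column, since MySQL lacks PostgreSQL-style expression indexes before 8.0.13 (column and index names illustrative):

```sql
-- MySQL 5.7+: emulate an expression index via a generated column
ALTER TABLE orders
    ADD COLUMN customer_email_lower VARCHAR(255)
        GENERATED ALWAYS AS (LOWER(customer_email)) STORED,
    ADD INDEX idx_orders_email_lower (customer_email_lower);

-- MySQL 8.0.13+ also supports functional indexes directly:
-- CREATE INDEX idx_orders_email_lower ON orders ((LOWER(customer_email)));
```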

Can RapidDev help optimize our database indexes?

Yes. RapidDev provides database performance audits that analyze slow query logs, identify missing indexes, and configure Cursor rules for ongoing index optimization as your schema evolves.
