RapidDev - Software Development Agency

How to Connect Notion CLI to OpenClaw

To connect Notion CLI to OpenClaw, install the Notion CLI tool (`notion-cli` or the official `@notionhq/client` via Node.js), configure your NOTION_API_KEY, and use CLI commands for bulk page operations, exports, and scripted batch workflows. Unlike the direct Notion API integration (real-time CRUD), Notion CLI is optimized for batch operations — exporting large databases, running bulk updates, and scripting complex multi-step Notion workflows from shell scripts or scheduled OpenClaw tasks.

What you'll learn

  • How to install and configure the Notion CLI tool with your NOTION_API_KEY
  • How to run bulk database queries, exports, and page operations from the command line
  • How to schedule Notion CLI batch jobs through OpenClaw's task scheduler
  • How to write shell scripts combining Notion CLI with other tools for complex workflows
  • How to troubleshoot Notion CLI authentication errors and API version issues
Intermediate · 15 min read · Notes & Documents · March 2026 · RapidDev Engineering Team
Why Use Notion CLI with OpenClaw Instead of Direct API?

The direct Notion API integration in OpenClaw is ideal for real-time, event-driven operations — when a workflow completes, create a page; when a message arrives, update a CRM entry. But there is a whole class of Notion workflows that are better expressed as batch scripts: nightly database exports to JSON, bulk updates to change all pages in a database from one status to another, archive operations that move stale entries to a separate database, weekly report generation from multiple databases, and data migration between Notion databases.

For these bulk and scheduled operations, the Notion CLI approach offers several advantages over direct API calls. CLI scripts are easier to write, read, and debug than multi-step HTTP request chains. They can be versioned as shell scripts in your project. They integrate naturally with other CLI tools — you can pipe Notion data through `jq` for transformation, use `diff` to detect changes, and combine Notion exports with file operations. OpenClaw can execute these scripts as scheduled tasks or trigger them as part of complex multi-step workflows.
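
To make the `jq` piping concrete, here is a minimal sketch using sample data in the flattened shape the query scripts later in this guide produce. The file path, field names, and values are illustrative, and `jq` must be installed:

```shell
# Sample export in the flattened { count, items } shape used by this guide:
cat > /tmp/notion-export.json <<'EOF'
{"count": 2, "items": [
  {"id": "p1", "title": "Ship v2", "status": "In Progress", "created": "2026-03-01T10:00:00Z"},
  {"id": "p2", "title": "Write docs", "status": "Done", "created": "2026-03-02T09:00:00Z"}
]}
EOF

# Pipe the export through jq to pull out just the in-progress titles:
jq -r '.items[] | select(.status == "In Progress") | .title' /tmp/notion-export.json
```

The same pattern composes with `diff`, `sort`, or any other CLI tool, which is exactly what makes the scripting approach attractive for batch work.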

This integration is specifically for builders who prefer a scripting-first approach to Notion automation. If you need live, reactive Notion updates triggered by events in real time, use the direct Notion API integration instead. Notion CLI shines for scheduled nightly jobs, one-time migrations, and exploratory data work against your Notion workspace — and it uses the same NOTION_API_KEY as the direct API integration, so both can coexist in the same OpenClaw config.

Integration method

CLI via shell execution

Notion CLI wraps Notion's REST API in a command-line interface suitable for scripting, bulk operations, and scheduled batch jobs. OpenClaw invokes CLI commands via shell execution, passing your NOTION_API_KEY as an environment variable. This approach is distinct from the direct Notion API integration — where OpenClaw makes HTTP calls at runtime — because CLI commands are designed for batch workflows, scheduled maintenance tasks, and export pipelines rather than real-time event-driven CRUD operations.

Prerequisites

  • A Notion account with databases you want to operate on, and a NOTION_API_KEY from notion.so/my-integrations
  • Node.js installed (version 16+) for running the Notion CLI and client library
  • OpenClaw installed and running with shell execution capabilities
  • Notion databases shared with your integration via the 'Connect to' menu in Notion
  • Basic familiarity with shell scripting, JSON, and command-line tools
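
A quick way to confirm the Node.js prerequisite before proceeding (assumes `node` and `npm` are already on your PATH):

```shell
# Should print v16.x or later:
node --version
npm --version
```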

Step-by-step guide

1

Install the Notion CLI and Client Library

Several Notion CLI tools exist. The most commonly used options are:

  • **@notionhq/client** — Notion's official JavaScript SDK. Not a CLI itself, but a thin wrapper script turns it into one, and it stays most up to date with the API.
  • **notion-cli-js** — an npm package providing `notion` CLI commands for common operations (query, create, update, search).
  • **notion-export-types** — specialized for exporting Notion content to various formats.

For general-purpose batch operations with OpenClaw, the recommended approach is to install `@notionhq/client` (the official SDK) and write thin Node.js scripts that serve as your 'CLI'. This gives you full API coverage rather than limiting you to what a third-party CLI exposes. Alternatively, if you prefer a ready-made CLI without writing scripts, `notion-cli` (community package) provides commands like `notion-cli pages list`, `notion-cli db query`, and `notion-cli page create`. Install your chosen tool globally so OpenClaw can invoke it from any working directory.

terminal
# Option A: Install official Notion SDK for scripting:
npm install -g @notionhq/client
# Verify:
node -e "const { Client } = require('@notionhq/client'); console.log('Notion SDK ready')"

# Option B: Install community Notion CLI:
npm install -g notion-cli-js
# Verify:
notion-cli --version

# Option C: Use npx for one-off runs without a global install.
# Note: @notionhq/client is a library with no executable, so npx
# applies to the CLI packages, not the SDK:
npx notion-cli-js --version

# Test with your API key:
export NOTION_API_KEY="secret_xxxxxxxxxxxxxxxxxxxx"
node -e "
const { Client } = require('@notionhq/client');
const notion = new Client({ auth: process.env.NOTION_API_KEY });
notion.users.me().then(u => console.log('Connected as:', u.type));
"

Pro tip: Use `npm install -g` rather than a project-local install for CLI tools that OpenClaw will invoke from shell commands — global installs are accessible from any directory, while local installs require specifying the path.

Expected result: The Notion SDK or CLI is installed globally and the connection test returns your bot user details, confirming the NOTION_API_KEY is valid.

2

Configure NOTION_API_KEY in OpenClaw

OpenClaw needs to know the NOTION_API_KEY so it can pass it to CLI commands as an environment variable. Configure this in `~/.openclaw/config.yaml` under the `notion_cli` integration block. Unlike the direct Notion API integration (which makes HTTP calls through OpenClaw's HTTP client), the CLI integration works by setting environment variables when OpenClaw executes shell commands. The `env` block in the config defines which environment variables are injected into shell command execution. Also configure the `script_dir` — the directory where your Notion CLI scripts are stored. OpenClaw can reference scripts by name in workflow configurations without needing full paths. The `databases` block (same as the direct Notion integration) provides friendly names for database IDs, making scripts more readable. You can use the same database IDs as your direct Notion config if both integrations are active.

~/.openclaw/config.yaml
# ~/.openclaw/config.yaml
integrations:
  notion_cli:
    env:
      NOTION_API_KEY: "${NOTION_API_KEY}"  # Injected into all CLI command environments
      NOTION_VERSION: "2022-06-28"
    script_dir: ~/.openclaw/notion-scripts  # Directory for Notion CLI scripts
    node_path: /usr/local/bin/node          # Path to node binary (run: which node)
    databases:
      projects: "a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6"
      tasks: "b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q7"
      reports: "c3d4e5f6g7h8i9j0k1l2m3n4o5p6r8s9"
    timeout: 120          # Max seconds for a CLI command to run
    output_format: json   # Expected output format from scripts

Pro tip: Always use `${NOTION_API_KEY}` (environment variable substitution) rather than pasting the actual key in the config. Export `NOTION_API_KEY` in your shell profile (`~/.zshrc` or `~/.bash_profile`) and OpenClaw will expand the reference automatically.

Expected result: `clawhub reload` completes without errors. Running `clawhub run notion_cli test` (if available) confirms the integration can execute shell commands with the correct environment variables.

3

Write Your First Notion CLI Script

Create the script directory and write a reusable script for database querying. This example script takes a database ID and an optional filter as arguments, queries Notion, and outputs the results as JSON to stdout — which OpenClaw can capture and pass to downstream workflow steps. Store scripts in `~/.openclaw/notion-scripts/`. Each script should:

  • Read configuration from environment variables (NOTION_API_KEY, database IDs)
  • Accept parameters as command-line arguments
  • Output results to stdout as JSON
  • Exit with code 0 on success, non-zero on failure
  • Write error details to stderr (not stdout) so OpenClaw can distinguish errors from data

This structure makes your Notion scripts composable with other shell tools and easy for OpenClaw to invoke and parse.

~/.openclaw/notion-scripts/query-database.js
#!/usr/bin/env node
// ~/.openclaw/notion-scripts/query-database.js
// Usage: node query-database.js DATABASE_ID [status_filter]

const { Client } = require('@notionhq/client')

const notion = new Client({ auth: process.env.NOTION_API_KEY })
const databaseId = process.argv[2]
const statusFilter = process.argv[3]

if (!databaseId) {
  process.stderr.write('Usage: node query-database.js DATABASE_ID [status_filter]\n')
  process.exit(1)
}

async function queryDatabase() {
  const filter = statusFilter ? {
    property: 'Status',
    select: { equals: statusFilter }
  } : undefined

  const response = await notion.databases.query({
    database_id: databaseId,
    filter,
    sorts: [{ property: 'Created', direction: 'descending' }],
    page_size: 100
  })

  // Output clean JSON to stdout
  const results = response.results.map(page => ({
    id: page.id,
    title: page.properties.Name?.title?.[0]?.text?.content || 'Untitled',
    status: page.properties.Status?.select?.name || null,
    created: page.created_time
  }))
  console.log(JSON.stringify({ count: results.length, items: results }, null, 2))
}

queryDatabase().catch(err => {
  process.stderr.write(`Error: ${err.message}\n`)
  process.exit(1)
})

Pro tip: Always map Notion's nested property structure to a flat, clean JSON output in your scripts. Notion's raw API response has deeply nested property objects — the `page.properties.Name.title[0].text.content` pattern is typical. Flatten this in the script so downstream workflow steps receive simple key-value objects.

Expected result: Running `node ~/.openclaw/notion-scripts/query-database.js YOUR_DATABASE_ID` returns a clean JSON array of database entries with count, title, status, and created date.

4

Write a Bulk Update Script

The key differentiator of the CLI approach is batch operations. The direct Notion API integration in OpenClaw works one page at a time, triggered by events. For bulk operations — updating 200 entries, exporting 500 pages, changing status across an entire database — a script is far more practical. This script queries a database for all entries matching a condition, then updates each one. It uses Notion's cursor-based pagination (each query returns at most 100 results) to handle databases larger than 100 entries, and it logs progress to stderr while outputting a summary to stdout. Before running bulk updates in production, add a `--dry-run` flag to your scripts that reports what would be changed without actually making the changes. RapidDev's team strongly recommends this practice — bulk API operations are not easily reversible, and a dry run helps catch mistakes before they affect real data.

~/.openclaw/notion-scripts/bulk-update-status.js
#!/usr/bin/env node
// ~/.openclaw/notion-scripts/bulk-update-status.js
// Usage: node bulk-update-status.js DATABASE_ID OLD_STATUS NEW_STATUS [--dry-run]

const { Client } = require('@notionhq/client')
const notion = new Client({ auth: process.env.NOTION_API_KEY })

const [,, databaseId, oldStatus, newStatus] = process.argv
const dryRun = process.argv.includes('--dry-run')

if (!databaseId || !oldStatus || !newStatus) {
  process.stderr.write('Usage: node bulk-update-status.js DATABASE_ID OLD_STATUS NEW_STATUS [--dry-run]\n')
  process.exit(1)
}

async function getAllPages(dbId, filterStatus) {
  let allPages = []
  let cursor
  do {
    const response = await notion.databases.query({
      database_id: dbId,
      start_cursor: cursor,
      filter: { property: 'Status', select: { equals: filterStatus } },
      page_size: 100
    })
    allPages = allPages.concat(response.results)
    cursor = response.has_more ? response.next_cursor : null
    process.stderr.write(`Fetched ${allPages.length} pages...\n`)
  } while (cursor)
  return allPages
}

async function run() {
  const pages = await getAllPages(databaseId, oldStatus)
  process.stderr.write(`Found ${pages.length} pages with status '${oldStatus}'\n`)
  if (dryRun) {
    console.log(JSON.stringify({ dry_run: true, would_update: pages.length, from: oldStatus, to: newStatus }))
    return
  }
  let updated = 0
  for (const page of pages) {
    await notion.pages.update({
      page_id: page.id,
      properties: { Status: { select: { name: newStatus } } }
    })
    updated++
    await new Promise(r => setTimeout(r, 350)) // Respect rate limit (3 req/sec)
  }
  console.log(JSON.stringify({ updated, from: oldStatus, to: newStatus }))
}

run().catch(err => { process.stderr.write(err.message + '\n'); process.exit(1) })

Pro tip: The `await new Promise(r => setTimeout(r, 350))` delay between page updates keeps you under Notion's 3-requests-per-second rate limit. At 350ms between calls, you stay at approximately 2.8 requests/second — safely below the limit even accounting for other API calls.

Expected result: Running the script with `--dry-run` shows the count of pages that would be updated. Running without `--dry-run` updates all matching entries and outputs the update count as JSON.

5

Schedule Batch Jobs in OpenClaw

With scripts written and tested, configure OpenClaw to run them on a schedule using the workflow scheduler. Scheduled Notion CLI jobs are the primary production use case for this integration — nightly backups, weekly report generation, monthly archiving, and data hygiene tasks.

Configure scheduled workflows in OpenClaw that execute shell commands pointing to your scripts. Pass database IDs as arguments using the friendly names from your config. Capture stdout output and optionally log it to another Notion database for an audit trail. For jobs that should only run on non-error conditions (e.g., only archive after a successful export), use workflow conditionals that check the exit code and output of preceding steps before proceeding.

Monitor scheduled job execution in OpenClaw's log. Set up an alert (via Telegram or Email Daily Summary) if a critical scheduled Notion job fails — silent failures in nightly backups are particularly problematic because they go unnoticed until you actually need the backup.

~/.openclaw/config.yaml
# ~/.openclaw/config.yaml scheduled Notion CLI batch jobs
workflows:
  notion-nightly-export:
    schedule: "0 23 * * *"   # Run at 11 PM every night
    timezone: "America/New_York"
    steps:
      - name: export-tasks-db
        type: shell
        command: node ~/.openclaw/notion-scripts/query-database.js
        args:
          - "{{config.integrations.notion_cli.databases.tasks}}"
        capture_stdout: true
        on_failure: notify_and_stop
      - name: save-export
        type: shell
        command: bash
        args:
          - "-c"
          - "echo '{{steps.export-tasks-db.output}}' > ~/.openclaw/backups/tasks-$(date +%Y%m%d).json"
      - name: cleanup-old-backups
        type: shell
        command: find
        args:
          - "~/.openclaw/backups"
          - "-name"
          - "tasks-*.json"
          - "-mtime"
          - "+30"
          - "-delete"

  notion-weekly-stale-cleanup:
    schedule: "0 9 * * 1"   # Every Monday at 9 AM
    steps:
      - name: archive-stale-tasks
        type: shell
        command: node ~/.openclaw/notion-scripts/bulk-update-status.js
        args:
          - "{{config.integrations.notion_cli.databases.tasks}}"
          - "In Progress"
          - "Stale"

Pro tip: Create the `~/.openclaw/backups/` directory before scheduling the nightly export job. OpenClaw's shell execution will fail if the target directory does not exist — and this job runs at 11 PM, when you are unlikely to be watching.
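
Creating the directory is a one-liner — a sketch assuming the default backup path used in the workflow above:

```shell
# -p creates parent directories and is a no-op if the path already exists:
mkdir -p ~/.openclaw/backups
ls -d ~/.openclaw/backups
```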

Expected result: Scheduled jobs appear in `clawhub schedule list` with their next run times. Running `clawhub run notion-nightly-export` manually confirms the scripts execute correctly before the scheduled run.

Common use cases

Nightly Database Export and Backup

Schedule a nightly OpenClaw task that exports a Notion database to JSON using the Notion CLI, processes the output with jq, and saves timestamped backup files. Provides a local backup of Notion data that is independent of Notion's built-in export features.

OpenClaw Prompt

Configure OpenClaw to run a nightly Notion CLI export of my 'Projects' database at 11 PM, save the output as a JSON file with the date in the filename, and keep the last 30 days of backups


Bulk Status Updates Across Large Databases

Run a batch update operation that changes the status property of multiple Notion database entries at once — mark all tasks older than 90 days as 'Archived', close all open projects from a previous quarter, or reset experiment statuses at the start of a new sprint. Doing this one entry at a time through the Notion UI is impractical for large databases.

OpenClaw Prompt

Write a Notion CLI script for OpenClaw that finds all entries in my 'Tasks' database where Status = 'In Progress' and Date < 30 days ago, and updates them to Status = 'Stale' with a note in the Comments field


Cross-Database Report Generation

Pull data from multiple Notion databases, join them on a common property (project name, user ID, etc.), and generate a structured report page in a reporting database. Automate the weekly or monthly report generation that previously required manual copying between databases.
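
The join step itself can be handled with `jq` before any Notion writes happen. A hedged sketch using sample exports — the file names, the `project` key, and the fields are illustrative, not from a live workspace:

```shell
# Two exports, as a query script for each database might produce them:
cat > /tmp/tasks.json <<'EOF'
[{"project": "Apollo", "done": 4}, {"project": "Zephyr", "done": 2}]
EOF
cat > /tmp/hours.json <<'EOF'
[{"project": "Apollo", "hours": 12}, {"project": "Zephyr", "hours": 7}]
EOF

# Slurp both files, join rows on .project, and merge the fields:
jq -s '[ .[0][] as $t | .[1][] | select(.project == $t.project) | $t + . ]' \
  /tmp/tasks.json /tmp/hours.json
```

The merged output can then be fed to a page-creation script that writes the report page to Notion.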

OpenClaw Prompt

Create an OpenClaw workflow that runs every Monday at 9 AM: query my 'Completed Tasks' and 'Time Log' databases for last week's entries, join them by project name, calculate totals, and create a new 'Weekly Report' page with the results


Troubleshooting

Node.js scripts fail with 'Cannot find module @notionhq/client'

Cause: The Notion SDK was installed locally (in a project directory) rather than globally, or the global npm path is not in OpenClaw's PATH environment when executing shell commands.

Solution: Install globally with `npm install -g @notionhq/client`. If you prefer local installs, set the `node_modules` path explicitly in your script using `require('/full/path/to/node_modules/@notionhq/client')`. Check global install location with `npm root -g`.

shell
# Check global npm install location:
npm root -g
# e.g.: /usr/local/lib/node_modules

# Verify global install:
ls $(npm root -g)/@notionhq/client

# Reinstall globally:
npm install -g @notionhq/client

NOTION_API_KEY is undefined in scripts — 'Client is not authorized'

Cause: The environment variable is not being passed to the shell command execution context. OpenClaw injects configured env vars, but if the variable is not in the `env` block or the export is missing from the shell profile, it will be undefined.

Solution: Add `NOTION_API_KEY: "${NOTION_API_KEY}"` to the `env` block in your `notion_cli` config. Also ensure the variable is exported in your shell profile (`export NOTION_API_KEY=secret_xxx` in `~/.zshrc` or `~/.bash_profile`). Run `source ~/.zshrc` after editing to apply changes to the current session.

shell
# Test that the variable is exported in your shell:
echo $NOTION_API_KEY
# If empty, add to shell profile and re-source:
echo 'export NOTION_API_KEY=secret_xxxx' >> ~/.zshrc
source ~/.zshrc

Bulk update script hits rate limit — '429 Too Many Requests'

Cause: Notion enforces 3 requests per second per integration. Scripts without delays between API calls exceed this limit on databases with many entries.

Solution: Add a delay between API calls in your scripts. Use `await new Promise(r => setTimeout(r, 350))` between updates to stay under the 3/second limit. For read-heavy scripts, cache intermediate results and minimize redundant API calls.

javascript
// Add rate limiting to any loop that makes multiple API calls:
for (const page of pages) {
  await notion.pages.update({ page_id: page.id, properties: { ... } })
  await new Promise(r => setTimeout(r, 350)) // ~2.8 requests/sec
}

Script exits silently without output — no error, no results

Cause: The database query returned zero results (filter matched nothing), or the database has not been shared with the integration and returns an empty result set rather than a 404 (for some query types).

Solution: Add logging to your scripts: output the count of results fetched before processing. Verify the database has been shared with the integration using `GET /v1/databases/{id}`. Check filter conditions — typos in status values are a common cause of zero-result queries.

javascript
// Add count logging to query scripts:
console.log(JSON.stringify({ count: results.length, message: results.length === 0 ? 'No results found — check filter conditions and database access' : 'OK' }))

Best practices

  • Always test scripts with a `--dry-run` flag before running bulk updates in production — output what would change without making API calls, since bulk Notion operations are difficult to reverse without a prior backup.
  • Store NOTION_API_KEY as an environment variable and inject it via OpenClaw's env block rather than hardcoding in scripts — scripts stored in version control should not contain credentials.
  • Add explicit rate limiting delays (350ms between calls) to every script that makes multiple sequential API calls — Notion's 3 requests/second limit is easy to exceed accidentally in loops over large databases.
  • Write scripts to output clean JSON to stdout and status/error messages to stderr — this separation lets OpenClaw capture and parse script output while keeping debug information accessible in logs without mixing with data.
  • Create dedicated test databases in Notion (duplicates of your real databases with sample data) for validating scripts before running them against production data — bulk update mistakes in production data are painful to reverse.
  • Add exit code handling: scripts should exit 0 on success and non-zero on any error condition, and always write a meaningful error message to stderr before exiting with an error code so OpenClaw can surface failures clearly in logs.
  • Schedule heavy batch jobs during off-hours (late night or early morning) to avoid competing with real-time API calls from the direct Notion integration during working hours — this keeps rate limit headroom for both integration modes.
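
The stdout/stderr separation recommended above is easy to verify with a toy script — a sketch, not a real Notion script, using temporary paths:

```shell
# A stand-in for a Notion script: logs to stderr, data to stdout.
cat > /tmp/notion-demo.sh <<'EOF'
#!/bin/sh
echo "progress: querying..." >&2   # stderr: progress/debug messages
echo '{"count": 3}'                # stdout: clean JSON data
EOF
chmod +x /tmp/notion-demo.sh

# Redirecting stdout captures only the data; the log line still
# reaches the terminal (or OpenClaw's log) via stderr:
/tmp/notion-demo.sh > /tmp/data.json
cat /tmp/data.json
```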

Frequently asked questions

How do I set up Notion CLI integration in OpenClaw?

Install `@notionhq/client` globally with `npm install -g @notionhq/client`, get your NOTION_API_KEY from notion.so/my-integrations, and configure it under `integrations.notion_cli.env.NOTION_API_KEY` in `~/.openclaw/config.yaml`. Write Node.js scripts in your configured `script_dir` and invoke them via OpenClaw's shell workflow steps. Share target Notion databases with your integration in Notion's UI via 'Connect to'.

What is the difference between Notion and Notion CLI integration in OpenClaw?

The direct Notion API integration makes real-time HTTP calls at event time — when a workflow fires, it creates or updates a Notion page immediately. Notion CLI integration runs shell scripts for batch operations on a schedule — exporting databases, running bulk updates, archiving old entries. Both use the same NOTION_API_KEY and can coexist in the same OpenClaw config for complementary use cases.

Do I need special permissions for bulk Notion operations via CLI?

No — the same NOTION_API_KEY and integration permissions used for individual page operations also apply to bulk operations. The API treats each call in a bulk loop identically to a single call. The only practical constraint is Notion's 3-requests-per-second rate limit, which your scripts must handle with explicit delays between calls.

Can OpenClaw Notion CLI export Notion content as Markdown or PDF?

Notion's official API does not natively export to Markdown or PDF format — it returns page content as block-type JSON. To export as Markdown, you would use a library like `notion-to-md` (npm) that converts Notion blocks to Markdown strings. For PDF export, Notion's web UI export (File > Export) is still the most reliable method as Notion has not exposed PDF export in their API.

How do I handle Notion database pagination in CLI scripts?

Notion API queries return maximum 100 results per call. For databases with more than 100 entries, check `response.has_more` — if true, use `response.next_cursor` as the `start_cursor` parameter on the next query call. Loop until `has_more` is false to retrieve all pages. The bulk-update script example in Step 4 shows a complete pagination loop implementation.

Is there help available for complex Notion CLI batch workflows in OpenClaw?

RapidDev's team builds custom Notion CLI scripts for OpenClaw including bulk database sync, cross-database report generation, nightly backup pipelines, and data migration scripts. If you need a complex batch workflow — large-scale status updates, cross-database joins, or scheduled data hygiene — reach out for hands-on script development support.
