UserTesting does not offer a public REST API — integration with Bolt.new means using UserTesting alongside your app rather than embedding it technically. The most practical approach is sharing your deployed Bolt app URL with UserTesting testers for unmoderated testing sessions. For in-app feedback, build a custom feedback widget in Bolt (star rating, NPS survey, open text) or add FullStory for automatic session replay instead of waiting for scheduled UserTesting sessions.
UserTesting with Bolt.new: Honest Guidance on What's Possible and What Actually Works
UserTesting is a premium UX research platform that provides video recordings of real users navigating your product while narrating their thoughts. It is genuinely valuable for understanding why users struggle with specific workflows — a single UserTesting session often reveals usability issues that months of analytics data obscures. However, UserTesting's API access is enterprise-only and not publicly documented, which means there is no practical code-based integration with Bolt.new apps.
The real-world workflow is straightforward: deploy your Bolt app to Bolt Cloud, Netlify, or any hosting platform to get a public URL, then share that URL with UserTesting when setting up a test. UserTesting testers receive the link, visit your app, and complete tasks you specify in the UserTesting platform. You watch the recordings in the UserTesting dashboard. The 'integration' is the shared URL, not an API connection.
For Bolt developers who want automated user feedback without UserTesting's enterprise pricing ($49,000+/year for large teams, though self-serve plans are available starting around $299/month), there are better alternatives that actually connect to Bolt apps programmatically. This tutorial covers three complementary approaches: deploying Bolt apps in a UserTesting-compatible way, building custom in-app feedback widgets that capture qualitative feedback continuously, and using FullStory's session replay for the automated observation capabilities that UserTesting provides through scheduled sessions.
Integration method
UserTesting has no public REST API for programmatic integration — the platform is accessed entirely through its web dashboard. The practical 'integration' with Bolt.new is sharing your deployed app URL with UserTesting testers for remote unmoderated sessions. For automated, in-app user research collection, build a custom feedback widget in Bolt (NPS surveys, rating forms, session context capture) that stores responses in your database. FullStory is the closest automated-replay alternative to UserTesting if budget is a constraint.
Prerequisites
- A deployed Bolt app with a public URL (Bolt Cloud, Netlify, or Vercel) — UserTesting requires a live URL, not a WebContainer preview
- A UserTesting account (self-serve plans start at ~$299/month; enterprise plans for larger teams)
- For the custom NPS widget: a Supabase database with authentication to store and associate feedback with users
- For Typeform embed: a Typeform account with a survey created and a form ID
Step-by-step guide
Deploy Your Bolt App and Prepare It for External Testing
UserTesting requires a publicly accessible URL that panel testers can visit without creating accounts or knowing your internal systems. Before setting up a UserTesting study, deploy your Bolt app to a production hosting environment.

Bolt Cloud is the simplest option: click 'Publish' in the top-right corner of the Bolt editor. Your app deploys to a .bolt.host URL with SSL automatically. You can also connect a custom domain in Bolt's Settings. Netlify is the alternative: go to Settings → Applications → Netlify → Connect, then click Publish. Both give you a stable HTTPS URL you can share with UserTesting.

For UserTesting sessions, consider whether you want testers to use real functionality or a demo mode. Real functionality requires testers to create accounts, which adds friction and can fail if your auth flow is confusing. A demo mode with pre-filled data and a guest login is often better for testing UI flows. Create a demo user account in your Supabase Auth with a simple password and include the credentials in your UserTesting task instructions.

Verify your deployed app is ready for external users: check that all pages load without errors, navigation links work, the mobile responsive layout functions correctly, and error messages are user-friendly. UserTesting session recordings that surface technical error messages ('MongooseServerSelectionError') rather than helpful UI messages create noise in your research. Invest 30 minutes testing your own deployment before sending it to UserTesting panelists.

Note: Bolt's WebContainer preview URL (the development URL inside Bolt's editor) is not suitable for UserTesting. The preview URL is session-specific, requires Bolt login context, and will not load reliably for external testers. Always use your deployed URL.
Prepare my Bolt app for external user testing. Create a /demo page that automatically logs in with demo credentials (email: demo@example.com, password to be set in environment variables as DEMO_USER_PASSWORD) and redirects to the main app with pre-populated sample data. Add a dismissible banner at the top of the demo page that says 'You are viewing a demo version of this app. Your changes will not be saved.' Create a reset-demo API route at /api/demo/reset that clears any changes made by the demo user and resets sample data to its initial state.
Paste this in Bolt.new chat
```typescript
// app/api/demo/reset/route.ts
import { NextResponse } from 'next/server';
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY! // service role to bypass RLS
);

const DEMO_USER_ID = process.env.DEMO_USER_ID!;

export async function POST() {
  try {
    // Delete all demo user's created items
    await supabase
      .from('user_projects')
      .delete()
      .eq('user_id', DEMO_USER_ID);

    // Reset demo user profile to defaults
    await supabase
      .from('profiles')
      .update({ onboarding_complete: false, settings: {} })
      .eq('id', DEMO_USER_ID);

    return NextResponse.json({ success: true, message: 'Demo reset complete' });
  } catch (error: unknown) {
    const e = error as { message: string };
    return NextResponse.json({ error: e.message }, { status: 500 });
  }
}
```

Pro tip: Create a dedicated Supabase user for demo purposes and store their user ID in the DEMO_USER_ID environment variable. Never use a real user's credentials for UserTesting sessions — always use the throwaway demo account to protect real user data.
Expected result: Your Bolt app is deployed to a public URL (e.g., yourapp.bolt.host or yourapp.netlify.app). The demo login flow works without manual account creation. UserTesting testers can access all features using the demo credentials you provide in the task instructions.
Build a Custom NPS Survey Widget for Continuous Feedback
While UserTesting provides scheduled deep-dive sessions, continuous in-app feedback collection fills the gaps between research rounds. A Net Promoter Score (NPS) survey is the most efficient single-question feedback mechanism: asking users 'How likely are you to recommend this to a friend?' on a 0-10 scale takes 30 seconds and gives you actionable segmentation data (Detractors 0-6, Passives 7-8, Promoters 9-10).

The best time to show an NPS survey is after a user experiences enough of your product to have an informed opinion — typically after 3-5 active sessions, after completing onboarding, or after reaching a key value moment (their first successful export, first project shared, first successful API call). Showing it too early (first visit) collects meaningless data.

Display the NPS survey as a modal or slide-up panel to avoid disrupting the current workflow. Use localStorage to ensure it only shows once per user — checking both localStorage and your database prevents it from reappearing after a user clears browser storage. Store responses in Supabase with the user ID, score, comment, app version, and the page/feature they were using when the survey appeared.

Review NPS responses weekly in your Supabase dashboard. Create a simple Supabase query: SELECT score, comment, created_at FROM nps_responses ORDER BY created_at DESC LIMIT 50. For qualitative analysis, sort by score and read the comments in each segment — Detractor comments (0-6) tell you what to fix, Promoter comments (9-10) tell you what to amplify in your marketing.
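Before handing the build to Bolt, it helps to pin down the show/don't-show rule precisely. A minimal sketch of that gate as a pure function — the input names and the 3-session threshold are illustrative assumptions, not part of the prompt below:

```typescript
// Decide whether to show the NPS survey for the current user.
// Assumes three inputs: the localStorage 'nps_shown' flag, a
// database flag (existing row in nps_responses), and a count of
// the user's active sessions.
interface NpsGateInput {
  shownLocally: boolean;
  respondedInDb: boolean;
  activeSessions: number;
}

export function shouldShowNps(input: NpsGateInput): boolean {
  // Never re-show to someone who has already seen or answered it.
  if (input.shownLocally || input.respondedInDb) return false;
  // Wait until the user has enough experience for an informed opinion.
  return input.activeSessions >= 3;
}
```

Keeping this decision in one function makes it easy to tighten the trigger later (for example, gating on a value moment instead of a session count) without touching the survey UI.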
Create a complete NPS survey system. First, create a Supabase table nps_responses with columns: id (uuid, primary key), user_id (uuid, references auth.users), score (integer 0-10), comment (text, nullable), page_url (text), created_at (timestamptz, default now()). Enable RLS with a policy that allows authenticated users to insert their own responses. Then create a React component NpsSurvey that: shows after a 5-minute delay using setTimeout, checks localStorage for 'nps_shown' before showing, displays a 0-10 rating scale with visual highlight on selection, shows a text area for optional comment after score selection, submits to /api/nps via POST, sets localStorage 'nps_shown' = true on submit, and dismisses on close.
Paste this in Bolt.new chat
```typescript
// app/api/nps/route.ts
import { NextResponse } from 'next/server';
import { createRouteHandlerClient } from '@supabase/auth-helpers-nextjs';
import { cookies } from 'next/headers';

export async function POST(request: Request) {
  const supabase = createRouteHandlerClient({ cookies });
  const { data: { user } } = await supabase.auth.getUser();

  if (!user) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }

  const body = await request.json() as { score: number; comment?: string; page_url?: string };

  if (typeof body.score !== 'number' || body.score < 0 || body.score > 10) {
    return NextResponse.json({ error: 'Score must be 0-10' }, { status: 400 });
  }

  const { error } = await supabase.from('nps_responses').insert({
    user_id: user.id,
    score: body.score,
    comment: body.comment ?? null,
    page_url: body.page_url ?? '',
  });

  if (error) {
    return NextResponse.json({ error: error.message }, { status: 500 });
  }

  return NextResponse.json({ success: true });
}
```

Pro tip: Add a cron job (Supabase Edge Functions with pg_cron, or a weekly manual query) to calculate your NPS score: NPS = (% Promoters) - (% Detractors). Track this monthly. An NPS above 50 is excellent for a product; above 70 is exceptional. Use the trend direction as a leading indicator of retention.
Expected result: The NPS survey appears after 5 minutes of use for users who haven't been surveyed. On submission, responses are stored in Supabase with the user ID, score, and comment. The localStorage flag prevents the survey from showing again. Query nps_responses in the Supabase Table Editor to verify data is being captured.
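Once responses accumulate, the weekly NPS number from the pro tip above is simple arithmetic over the raw scores. A small helper, assuming you have already pulled the score column out of nps_responses:

```typescript
// NPS = (% promoters) - (% detractors), rounded to a whole number.
// Scores 0-6 are detractors, 7-8 passives, 9-10 promoters.
export function npsScore(scores: number[]): number {
  if (scores.length === 0) return 0;
  const promoters = scores.filter((s) => s >= 9).length;
  const detractors = scores.filter((s) => s <= 6).length;
  return Math.round(((promoters - detractors) / scores.length) * 100);
}
```

For example, the scores [10, 9, 7, 3] contain two promoters, one passive, and one detractor, giving (2 - 1) / 4 = an NPS of 25.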
Build a Contextual Session Feedback Form
UserTesting sessions give deep qualitative insight but require users to be available for a scheduled session. A contextual feedback widget that appears after specific user actions captures feedback in the moment — when the user's experience is freshest and most accurate.

The most effective contextual feedback trigger points: after an error (the user just hit a bug — ask immediately what they were trying to do), after completing a complex flow (onboarding, checkout, project setup), after the user has been idle for 5+ minutes on a page (they may be stuck), or as an exit intent overlay when the user moves to close the tab.

For a simple implementation, create a floating feedback button (a question mark icon in the bottom-right corner) that users can click any time to leave feedback. This respects user autonomy better than popup surveys and consistently captures feedback from the most engaged users — people who are frustrated enough or impressed enough to proactively report their experience.

Store feedback with context: the page URL, user ID, timestamp, app version (from environment variables), and the user's session duration. This context transforms raw feedback into actionable insights. 'The export button doesn't work' is useful; 'The export button doesn't work on the /projects/[id]/settings page, reported by a Pro plan user after 18 minutes in the session' is a bug report your engineering team can act on immediately.

Note: Bolt's WebContainer cannot receive incoming webhooks, but this feedback form submits to a Next.js API route — an outbound POST that works in both the preview and after deployment.
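The context capture described above is easiest to keep consistent if one helper builds the context object for every submission. A sketch under the assumption that you record a session-start timestamp at page load; the field names here are illustrative and should match your own feedback table:

```typescript
interface FeedbackContext {
  page_url: string;
  user_id: string | null;
  app_version: string;
  session_seconds: number;
  reported_at: string; // ISO 8601 timestamp
}

// Build the context attached to every feedback submission.
// sessionStartMs is when the session began; nowMs is injectable
// so the function stays deterministic under test.
export function buildFeedbackContext(opts: {
  pageUrl: string;
  userId?: string;
  appVersion?: string;
  sessionStartMs: number;
  nowMs?: number;
}): FeedbackContext {
  const now = opts.nowMs ?? Date.now();
  return {
    page_url: opts.pageUrl,
    user_id: opts.userId ?? null,
    app_version: opts.appVersion ?? 'unknown',
    session_seconds: Math.max(0, Math.round((now - opts.sessionStartMs) / 1000)),
    reported_at: new Date(now).toISOString(),
  };
}
```

With this in place, the 'Pro plan user after 18 minutes on /projects/[id]/settings' level of detail comes for free on every report instead of depending on each call site remembering to attach it.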
Create a floating feedback widget for my app. Build a FeedbackWidget component with: a circular button in the bottom-right corner with a speech bubble icon, a slide-up panel that appears on click with a textarea for feedback and a radio group for sentiment (happy/neutral/frustrated), a submit button that POSTs to /api/feedback with the message, sentiment, current URL, and optional user ID from auth context. Create the /api/feedback API route that stores feedback in a Supabase 'feedback' table with columns: id, user_id (nullable), message, sentiment, page_url, created_at. Show a 'Thank you!' message on successful submit and auto-close after 2 seconds.
Paste this in Bolt.new chat
```tsx
// components/FeedbackWidget.tsx (simplified structure)
'use client';

import { useState } from 'react';

type Sentiment = 'happy' | 'neutral' | 'frustrated';

export function FeedbackWidget() {
  const [open, setOpen] = useState(false);
  const [message, setMessage] = useState('');
  const [sentiment, setSentiment] = useState<Sentiment>('neutral');
  const [submitted, setSubmitted] = useState(false);

  const submit = async () => {
    await fetch('/api/feedback', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        message,
        sentiment,
        page_url: window.location.href,
      }),
    });
    setSubmitted(true);
    setTimeout(() => {
      setOpen(false);
      setSubmitted(false);
      setMessage('');
    }, 2000);
  };

  return (
    <div style={{ position: 'fixed', bottom: 24, right: 24, zIndex: 1000 }}>
      {!open && (
        <button
          onClick={() => setOpen(true)}
          style={{ borderRadius: '50%', width: 48, height: 48 }}
          aria-label="Give feedback"
        >
          💬
        </button>
      )}
      {open && (
        <div style={{ background: 'white', border: '1px solid #e2e8f0', borderRadius: 12, padding: 16, width: 300 }}>
          {submitted ? (
            <p>Thank you for your feedback!</p>
          ) : (
            <>
              <textarea
                value={message}
                onChange={(e) => setMessage(e.target.value)}
                placeholder="What's on your mind?"
                rows={4}
                style={{ width: '100%', marginBottom: 8 }}
              />
              <div style={{ display: 'flex', gap: 8, marginBottom: 12 }}>
                {(['happy', 'neutral', 'frustrated'] as Sentiment[]).map((s) => (
                  <button key={s} onClick={() => setSentiment(s)}
                    style={{ fontWeight: sentiment === s ? 'bold' : 'normal' }}>
                    {s === 'happy' ? '😊' : s === 'neutral' ? '😐' : '😤'}
                  </button>
                ))}
              </div>
              <button onClick={submit} disabled={!message.trim()}>Send</button>
              <button onClick={() => setOpen(false)} style={{ marginLeft: 8 }}>Cancel</button>
            </>
          )}
        </div>
      )}
    </div>
  );
}
```

Pro tip: Review feedback submissions weekly in the Supabase Table Editor. Sort by sentiment = 'frustrated' and created_at DESC to see your most recent user frustration points. Look for patterns: if 10 users report frustration on the same page_url, that page needs UX work.
Expected result: The floating feedback button appears on every page. Users can submit feedback with a sentiment rating. Responses appear in the Supabase 'feedback' table with the page URL and timestamp. The widget closes automatically after submission.
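The matching /api/feedback route (which the prompt above asks Bolt to generate) should validate the incoming body before inserting it. A minimal, framework-independent validation helper — a sketch, assuming the same message/sentiment/page_url fields the widget sends:

```typescript
type Sentiment = 'happy' | 'neutral' | 'frustrated';

interface FeedbackPayload {
  message: string;
  sentiment: Sentiment;
  page_url: string;
}

const SENTIMENTS: Sentiment[] = ['happy', 'neutral', 'frustrated'];

// Returns a cleaned payload, or null if the body is not usable.
export function parseFeedback(body: unknown): FeedbackPayload | null {
  if (typeof body !== 'object' || body === null) return null;
  const b = body as Record<string, unknown>;
  if (typeof b.message !== 'string' || b.message.trim() === '') return null;
  if (!SENTIMENTS.includes(b.sentiment as Sentiment)) return null;
  return {
    message: b.message.trim(),
    sentiment: b.sentiment as Sentiment,
    page_url: typeof b.page_url === 'string' ? b.page_url : '',
  };
}
```

Rejecting bad bodies up front (returning a 400 when parseFeedback yields null) keeps junk rows out of the feedback table and makes the weekly review queries trustworthy.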
Integrate FullStory as an Automated Alternative to UserTesting
For Bolt developers who want automated session observation without UserTesting's scheduling overhead and per-session costs, FullStory provides the closest equivalent: actual video recordings of user sessions captured automatically in production. Where UserTesting shows you 5-10 recruited users completing prescribed tasks, FullStory shows you thousands of real sessions from actual users doing actual things.

The trade-off: UserTesting provides narration (users verbalize what they're thinking), which reveals user mental models and confusion points that silent session replay cannot surface. FullStory shows you what users do; UserTesting shows you why. For a complete UX research program, both are valuable. For early-stage Bolt apps on a budget, FullStory's automatic capture often reveals the same usability issues as UserTesting studies.

Adding FullStory to your Bolt app requires the @fullstory/browser npm package and your FullStory Org ID. Install the package, initialize with your org ID in the app entry point, and FullStory begins recording all sessions automatically. The FullStory dashboard provides session search (find sessions where users rage-clicked, hit errors, or spent more than 5 minutes on a specific page) and DX Data (ML-powered frustration signal detection) that mimics some of UserTesting's insight generation without manual session scheduling.

Note: FullStory's browser SDK sends data directly from the user's browser to FullStory's servers over HTTPS, so it works in Bolt's WebContainer during development for initialization testing. Full session recording activates on your deployed production URL.
Add FullStory session replay to my app as an automated alternative to scheduled UserTesting sessions. Install @fullstory/browser. Create a lib/fullstory.ts file that initializes FullStory with the org ID from VITE_FULLSTORY_ORG_ID environment variable. Set devMode: true in development. Initialize in App.tsx using useEffect on mount. After my users log in, call FullStory.identify() with their user ID and email so I can filter sessions by user in the FullStory dashboard. Add VITE_FULLSTORY_ORG_ID=YOUR_ORG_ID to .env.
Paste this in Bolt.new chat
```typescript
// lib/fullstory.ts
import * as FullStory from '@fullstory/browser';

let initialized = false;

export function initFullStory(orgId: string) {
  if (initialized || !orgId) return;
  FullStory.init({
    orgId,
    devMode: import.meta.env.DEV,
  });
  initialized = true;
}

export function identifyUser(userId: string, email: string, plan = 'free') {
  if (!initialized) return;
  FullStory.identify(userId, {
    displayName: email,
    email,
    plan,
  });
}

export function anonymizeUser() {
  if (!initialized) return;
  FullStory.anonymize();
}

// In App.tsx:
// useEffect(() => {
//   initFullStory(import.meta.env.VITE_FULLSTORY_ORG_ID);
// }, []);
```

Pro tip: Use FullStory's segment builder to create a 'Frustrated Users' segment: sessions where users rage-clicked more than 3 times in a single session. Watch these sessions to understand which UI elements are causing frustration — this directly parallels what UserTesting would surface through user narration.
Expected result: FullStory initializes without errors in the Bolt preview. After deploying and receiving real users, session recordings appear in the FullStory dashboard. You can search sessions by user ID, URL, and frustration signals. This provides continuous passive observation between scheduled UserTesting sessions.
Common use cases
Conduct Unmoderated User Testing on a Deployed Bolt App
You've built a Bolt app and want real users to test it before launch. Deploy to Bolt Cloud or Netlify to get a stable URL. Create a UserTesting study with specific tasks (find the settings page, complete a purchase, invite a team member) and share the URL with the UserTesting panel. Watch recordings in the UserTesting dashboard. No API integration required — UserTesting works with any publicly accessible URL.
Help me prepare my Bolt app for UserTesting. Make sure the app is production-ready for external testers: check that all navigation flows work, add loading states for slow operations, ensure error messages are user-friendly (not technical error codes), and add a visible 'demo mode' banner so testers know they're on a test version. Also create a separate /demo route with pre-populated sample data so testers don't need to create accounts or enter real information.
Copy this prompt to try it in Bolt.new
Build an In-App NPS Survey Widget
Instead of waiting for scheduled UserTesting sessions, collect continuous feedback from real users with a Net Promoter Score widget that appears after key actions. Show users a 0-10 scale asking 'How likely are you to recommend this app to a friend?' and an optional text field for their reason. Save responses to your database and review them weekly. This provides continuous qualitative feedback between formal UserTesting rounds.
Build an NPS (Net Promoter Score) survey widget for my Bolt app. Create a NpsSurvey React component that appears as a modal overlay after a user has been active for 5 minutes (use a setTimeout). Show a 0-10 scale for 'How likely are you to recommend this to a friend?' and an optional text input for their reasoning. On submit, POST to /api/nps with the user's ID (from Supabase Auth), their score (0-10), their comment text, and the current page URL. Store in a Supabase 'nps_responses' table. Show the widget only once per user (check localStorage for 'nps_shown').
Copy this prompt to try it in Bolt.new
Embed Typeform for Contextual User Research
Typeform provides a conversational survey experience with branching logic that UserTesting lacks. Embed a Typeform survey as a popup or side panel in your Bolt app, triggered after specific user actions (completing onboarding, hitting an error, reaching a certain feature). Typeform's Webhooks deliver responses to your Next.js API route in real-time, populating your user research database automatically.
Embed a Typeform survey popup in my Bolt app that appears after a user completes onboarding. Get the Typeform form ID from my Typeform account (form URL format: typeform.com/to/FORM_ID). Create a TypeformEmbed React component that renders the Typeform widget using @typeform/embed-react. Show it after the onboarding completion event fires. Include a hidden field that passes the user's ID to the Typeform response for matching feedback to user accounts.
Copy this prompt to try it in Bolt.new
Troubleshooting
UserTesting testers cannot access my Bolt app — they see an error or blank page
Cause: The Bolt WebContainer preview URL was shared instead of the deployed URL, or the app was deployed but has a runtime error that only surfaces for new users (missing auth cookies, pre-populated state that development depended on).
Solution: Always share your deployed URL (yourapp.bolt.host or yourapp.netlify.app) with UserTesting — never the preview URL. Test the deployed URL in an incognito window without any auth cookies to simulate a new user's experience. Check the browser console for JavaScript errors on first load.
NPS survey appears multiple times for the same user
Cause: The localStorage check for 'nps_shown' only works per-browser. If the user clears storage, uses a different browser, or is on mobile, they see the survey again.
Solution: Add a server-side check in addition to localStorage: query your nps_responses table for an existing response from the current user before showing the survey. The API route can check the database and return { already_responded: true } if a response exists.
```typescript
// Add to your NPS check logic:
const checkNpsStatus = async (userId: string) => {
  const { data } = await supabase
    .from('nps_responses')
    .select('id')
    .eq('user_id', userId)
    .limit(1)
    .single();
  return !!data; // true = already responded
};
```

Feedback widget POSTs to /api/feedback but gets 401 Unauthorized
Cause: The /api/feedback route requires authentication via Supabase Auth, but the request doesn't include the auth cookie because the component is making a plain fetch() without the credentials: 'include' option.
Solution: The createRouteHandlerClient from @supabase/auth-helpers-nextjs reads cookies from the request. Ensure the fetch call includes credentials: 'include' (or use the Supabase client from the browser-side which handles cookies automatically). Alternatively, make user_id nullable in your feedback table and accept anonymous feedback.
```typescript
// Include credentials in the fetch call:
await fetch('/api/feedback', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  credentials: 'include', // Include auth cookies
  body: JSON.stringify({ message, sentiment, page_url }),
});
```

Best practices
- Always deploy to a production URL (Bolt Cloud or Netlify) before UserTesting sessions — the WebContainer preview URL is not suitable for external testers
- Create a dedicated demo account with pre-populated data for UserTesting sessions rather than asking testers to create real accounts
- Show NPS surveys after users experience value moments (completed onboarding, first successful action), not immediately on first visit
- Store user feedback with context (page URL, session duration, user plan) to make raw feedback actionable without requiring manual follow-up
- Combine UserTesting (scheduled qualitative) with FullStory (automated quantitative session replay) for a complete UX research program
- Reset demo data between UserTesting sessions using a /api/demo/reset endpoint to ensure each tester starts with a clean state
- Review NPS responses weekly — sort by Detractor scores first to identify the highest-impact UX issues to fix before your next UserTesting round
Alternatives
FullStory provides automatic session replay for all real users continuously — capturing the same 'watch users navigate your app' insights as UserTesting but passively, without scheduling sessions or paying per participant.
Crazy Egg provides aggregate heatmaps and scroll maps showing where all users click — useful for identifying confusing UI elements at scale, complementing the individual-session insight from UserTesting.
SurveyMonkey provides more structured survey creation with branching logic and statistical analysis than a custom NPS widget, suitable for formal user research between UserTesting sessions.
Typeform creates conversational, engaging surveys with higher completion rates than traditional forms — embed Typeform in your Bolt app for post-task feedback instead of scheduling UserTesting sessions for simple preference studies.
Frequently asked questions
Does UserTesting have an API I can use in my Bolt.new app?
UserTesting's API is enterprise-only and not publicly documented. There is no publicly available API for creating studies, managing panelists, or retrieving session recordings programmatically. The practical integration with Bolt.new is sharing your deployed app URL as the test URL in UserTesting's study setup dashboard — no code integration required.
Can UserTesting record sessions from Bolt's WebContainer preview?
No. UserTesting panelists open your URL in their own browser. The Bolt WebContainer preview URL is session-specific and requires Bolt authentication context to load — external testers cannot access it. Always deploy to Bolt Cloud, Netlify, or another hosting platform and share the deployed URL with UserTesting. Test the URL yourself in an incognito window before sharing it with testers.
What's the difference between UserTesting and FullStory?
UserTesting is a UX research platform where recruited participants complete tasks in scheduled sessions while narrating their thoughts — providing rich qualitative insight about user mental models. FullStory is an automatic session recording tool that captures all real user sessions passively. UserTesting shows you why users struggle; FullStory shows you that users struggle and where. For early-stage Bolt apps, FullStory is often more practical than UserTesting due to its lower cost and automatic operation.
How do I collect user feedback without UserTesting?
Build a custom feedback widget in your Bolt app (floating button → slide-up form → Supabase table), add an NPS survey that triggers after users experience value, or embed Typeform for structured surveys. These approaches collect continuous feedback from real users without the scheduling overhead of UserTesting. Combine with FullStory session replay for qualitative observation and you cover most of what UserTesting provides at lower cost.
What should I include in UserTesting task instructions for a Bolt app?
Provide clear, task-focused instructions without telling users how to complete the task (you want to observe their natural approach). Include login credentials for your demo account, specific tasks to attempt ('find the export function', 'upgrade to the Pro plan'), and any background context the tester needs. Keep tasks to 3-5 to fit within UserTesting's typical 20-30 minute session. Avoid leading questions like 'how easy is the checkout button to find?' — say instead 'please purchase a Pro subscription'.