How I Migrated Quaslation's Database from Supabase to Neon

A comprehensive step-by-step guide to migrating a PostgreSQL database from Supabase to Neon, including export, import, verification, and rollback strategies.

July 25, 2025
6 min read
Updated Jul 25, 2025
postgresql supabase neonlabs database migration

Introduction to the Migration

When I started working on my project Quaslation, I was using Supabase as the PostgreSQL database provider. Supabase served me well initially, but as my project grew, I wanted to explore Neon for its serverless capabilities, branching features, and modern PostgreSQL hosting.

This blog post walks you through how I tackled this migration step by step — from planning and export to import, verification, and rollback readiness — using detailed documentation and custom scripts.

Why the Migration?

I wanted to:

  • Take advantage of Neon’s branching for safer migrations.
  • Optimize my database costs with Neon’s serverless architecture.
  • Better align with my project’s evolving infrastructure.

However, migrating a live database is always risky, so I planned every detail meticulously.

Step 1: Documenting the Database Schema

Before writing any scripts, I created a complete schema doc that listed every table — User, RichText, Novel, Volume, Chapter, and PrismaMigrations — including columns, types, and relationships. This documentation served as the foundation for the entire migration process, ensuring that I understood the complete data structure before making any changes.

Having a comprehensive schema document is crucial because it helps you:

  • Understand the complete data structure
  • Identify dependencies between tables
  • Plan the export/import order correctly
  • Create verification scripts that can validate data integrity
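
If you'd rather generate this reference than type it out by hand, most of it can be pulled from PostgreSQL's information_schema. Here's a minimal sketch in that spirit (the environment variable and output format are illustrative, not my exact tooling):

```ts
// schema-doc.ts -- hypothetical helper for dumping a schema reference.
// Lists every column in the public schema in table order.
import { Client } from "pg";

async function main() {
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();

  const { rows } = await client.query(`
    SELECT table_name, column_name, data_type, is_nullable
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position
  `);

  for (const r of rows) {
    console.log(`${r.table_name}.${r.column_name}: ${r.data_type} (nullable: ${r.is_nullable})`);
  }

  await client.end();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```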

Step 2: Writing the Export Process

I wrote an export script that connects to Supabase using the pg package and exports each table as a separate JSON file; a simplified sketch appears after the strategy below. The export order follows the dependency hierarchy:

Export Strategy

Phase 1: Independent tables

  • RichText
  • User
  • PrismaMigrations

Phase 2: Dependent tables

  • Novel → Volume → Chapter

The export writes its JSON files to a git-ignored coverage/database-export/ folder, a simple but critical safeguard that keeps database contents out of version control.
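
The real script carries more logging and error handling, but the core loop is roughly the following sketch (the table list and folder mirror the strategy above; the environment variable name is illustrative):

```ts
// export-tables.ts -- simplified sketch of the export step.
import { Client } from "pg";
import { mkdirSync, writeFileSync } from "node:fs";
import path from "node:path";

// Phase 1 (independent tables) first, then Phase 2 in foreign-key order.
const TABLES = ["RichText", "User", "PrismaMigrations", "Novel", "Volume", "Chapter"];
const OUT_DIR = path.join("coverage", "database-export");

async function main() {
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();
  mkdirSync(OUT_DIR, { recursive: true });

  for (const table of TABLES) {
    // Quoted identifier: Prisma-style table names are case-sensitive.
    const { rows } = await client.query(`SELECT * FROM "${table}"`);
    writeFileSync(path.join(OUT_DIR, `${table}.json`), JSON.stringify(rows, null, 2));
    console.log(`Exported ${rows.length} rows from ${table}`);
  }

  await client.end();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```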

Key Considerations for Export

  • Dependency Order: Export tables in the correct order to respect foreign key relationships
  • Data Sanitization: Ensure sensitive data is properly handled
  • File Organization: Use a structured folder system for different table exports
  • Error Handling: Implement robust error handling for connection issues and data corruption

Step 3: Planning the Import Process

On the Neon side, I designed an import plan that reverses the export order. I ensured that foreign key constraints were temporarily disabled during import to prevent violations. The import script reads the JSON files and inserts the data in the correct sequence.

Import Strategy

Key considerations:

  • Use SET session_replication_role = replica to disable constraints temporarily
  • Re-enable constraints after import
  • Reset sequences for serial IDs
  • Handle data type conversions if necessary
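
Putting those points together, a stripped-down version of the import loop looks something like this (a sketch: a production script would also handle the type conversions and failure cases noted above):

```ts
// import-tables.ts -- simplified sketch of the import step.
import { Client } from "pg";
import { readFileSync } from "node:fs";
import path from "node:path";

const TABLES = ["RichText", "User", "PrismaMigrations", "Novel", "Volume", "Chapter"];
const OUT_DIR = path.join("coverage", "database-export");

async function main() {
  const client = new Client({ connectionString: process.env.NEON_DB_URL });
  await client.connect();

  // Defer foreign-key enforcement for this session only.
  await client.query("SET session_replication_role = replica");

  for (const table of TABLES) {
    const rows = JSON.parse(readFileSync(path.join(OUT_DIR, `${table}.json`), "utf8"));
    for (const row of rows) {
      const cols = Object.keys(row);
      const placeholders = cols.map((_, i) => `$${i + 1}`).join(", ");
      await client.query(
        `INSERT INTO "${table}" (${cols.map((c) => `"${c}"`).join(", ")})
         VALUES (${placeholders})`,
        Object.values(row)
      );
    }
    console.log(`Imported ${rows.length} rows into ${table}`);
  }

  // Restore normal constraint enforcement.
  await client.query("SET session_replication_role = DEFAULT");
  await client.end();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```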

Constraint Management

Disabling foreign key constraints during import is essential because:

  • It allows you to import data in any order (not necessarily dependency order)
  • It prevents constraint violations during the import process
  • It significantly speeds up the import process
  • It allows you to handle data that might temporarily violate constraints

However, you must remember to re-enable constraints after import and verify that all relationships are intact.
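
The sequence reset from the checklist can live at the end of that same import script. A sketch, assuming each table uses an integer id primary key (an assumption on my part about the schema):

```ts
// Still inside the import script: bump each serial sequence past the
// imported ids so new inserts don't collide with migrated rows.
for (const table of ["User", "RichText", "Novel", "Volume", "Chapter"]) {
  await client.query(
    `SELECT setval(
       pg_get_serial_sequence('"${table}"', 'id'),
       COALESCE((SELECT MAX(id) FROM "${table}"), 1)
     )`
  );
}
```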

Step 4: Verifying Data Integrity

I created a verification script to:

  • Check if JSON files match expected row counts
  • Validate JSON format
  • Generate a verification report for manual review

I also tested row counts after import to ensure data matched.
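
A stripped-down version of that count check might look like this (same caveats as the earlier sketches):

```ts
// verify-counts.ts -- sketch: compare exported JSON row counts against
// the Neon database after import, exiting non-zero on any mismatch.
import { Client } from "pg";
import { readFileSync } from "node:fs";
import path from "node:path";

const TABLES = ["RichText", "User", "PrismaMigrations", "Novel", "Volume", "Chapter"];

async function main() {
  const client = new Client({ connectionString: process.env.NEON_DB_URL });
  await client.connect();

  let ok = true;
  for (const table of TABLES) {
    const file = path.join("coverage", "database-export", `${table}.json`);
    const exported = JSON.parse(readFileSync(file, "utf8")).length;
    const { rows } = await client.query(`SELECT COUNT(*)::int AS n FROM "${table}"`);
    if (rows[0].n !== exported) ok = false;
    console.log(`${table}: exported=${exported} imported=${rows[0].n}`);
  }

  await client.end();
  process.exit(ok ? 0 : 1);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```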

Verification Process

The verification process involved several steps:

  1. Pre-import verification: Check exported JSON files for completeness and format
  2. Post-import verification: Compare row counts between source and destination databases
  3. Relationship verification: Ensure foreign key relationships are maintained
  4. Data validation: Spot-check actual data values to ensure integrity
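
For step 3, a per-foreign-key orphan query is usually enough. For example, inside the verification script (the volumeId column name is my assumption about the schema):

```ts
// Orphan check for one relationship; repeat for each foreign key.
const { rows } = await client.query(`
  SELECT COUNT(*)::int AS orphans
  FROM "Chapter" c
  LEFT JOIN "Volume" v ON v.id = c."volumeId"
  WHERE v.id IS NULL
`);
console.log(`Chapter -> Volume orphans: ${rows[0].orphans}`);
```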

Automated Verification

Automated verification is crucial because it provides:

  • Consistency: Ensures the same verification process every time
  • Speed: Much faster than manual verification
  • Documentation: Creates a record of what was verified and when
  • Early Detection: Catches issues early in the process

Step 5: Preparing a Rollback Plan

No migration is complete without a fallback. I drafted a rollback plan detailing:

Rollback Strategies

Hot rollback (immediate switch back to Supabase)

  • Requires application configuration changes
  • Minimal downtime
  • Best for critical applications

Staged rollback (schedule downtime, restore backups)

  • Requires planned maintenance window
  • More thorough testing possible
  • Best for non-critical applications

Partial rollback (specific tables)

  • Targeted approach for specific issues
  • Minimizes affected data
  • Best for localized problems

I backed up the entire Supabase database using pg_dump before starting. This backup serves as the ultimate safety net, allowing me to restore the database to its exact state before the migration if needed.
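
A custom-format dump is the standard way to take that snapshot; something along these lines (connection string and file paths will differ in your setup):

```sh
# Full safety-net backup of the Supabase database before touching anything.
pg_dump "$SUPABASE_DB_URL" --format=custom --file=pre-migration.dump

# Worst case, restore it with:
# pg_restore --clean --if-exists --dbname="$SUPABASE_DB_URL" pre-migration.dump
```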

Step 6: Final Migration

Once I verified all scripts locally and on a staging Neon database, I ran:

```sh
npm run db:migrate:full
```

This command coordinated:

  1. Export from Supabase
  2. Import into Neon
  3. Verification of row counts and relationships

All logs and manifest files were written to coverage/database-export/ for auditing.
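
I haven't reproduced the real package.json here, but conceptually the command just chains the three steps. A hypothetical version (the script paths and tsx runner are my guesses, not the actual config):

```json
{
  "scripts": {
    "db:export": "tsx scripts/export-tables.ts",
    "db:import": "tsx scripts/import-tables.ts",
    "db:verify": "tsx scripts/verify-counts.ts",
    "db:migrate:full": "npm run db:export && npm run db:import && npm run db:verify"
  }
}
```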

Migration Execution

The final migration process involved:

  1. Pre-migration checks: Verify all systems are ready
  2. Export execution: Run the export script with full logging
  3. Import execution: Run the import script with constraint handling
  4. Post-migration verification: Run comprehensive verification scripts
  5. Application testing: Test the application with the new database
  6. Cutover: Switch application to use Neon database

Lessons Learned

Through this migration process, I learned several valuable lessons:

Planning and Strategy

  • Plan the export/import order carefully to maintain foreign keys
  • Document everything: schema, processes, decisions, and issues
  • Test thoroughly in a staging environment before production
  • Have multiple rollback options available

Technical Implementation

  • Automate checks with clear logs for easier debugging
  • Always keep sensitive exports out of version control
  • Handle edge cases like null values, special characters, and large datasets
  • Monitor performance during both export and import phases

Project Management

  • Schedule adequate time: migrations often take longer than expected
  • Communicate with stakeholders about potential downtime
  • Have a rollback timeline: know when to abort and revert
  • Document the process for future reference and team knowledge

Conclusion

The migration went smoothly, and Quaslation now runs on Neon! This migration provided several benefits:

  • Cost optimization: Neon’s serverless architecture reduced database costs
  • Enhanced features: Access to Neon’s branching and advanced PostgreSQL features
  • Better performance: Improved query performance and scalability
  • Future-proofing: A more modern database infrastructure for future growth

If you’re planning a similar migration, feel free to learn from my experience and check out the detailed documentation I created. The key to a successful migration is thorough planning, careful execution, and comprehensive verification.

Happy shipping 🚀

Additional Resources

For those considering a similar migration, start with the official PostgreSQL, Supabase, and Neon documentation, along with the schema docs and scripts described throughout this post.

Remember that every migration is unique, so adapt these lessons to your specific use case and requirements.