I’m not sure how common this problem is, but I use a few AI tools to build my app (Lovable and Gemini Code Assistant), and as a result I can’t cleanly apply the migration SQL files these tools auto-generate (the culprit could also be changes I made manually outside of these scripts).
This prevents me from naturally bringing my remote Supabase down to local with CLI commands like:
supabase db reset
supabase link --project-ref {supabase-project-id}
supabase db pull --linked
supabase db push
Note: the first command will reset your local Supabase, so please use it if and only if you are not happy with the current state of your local instance.
If this works, good for you! You must be taking good care of your supabase/migrations/*.sql files. I ran into issues, though, and this article explains how I got around them.
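Before going as far as I did below, it can help to see where your local and remote migration histories disagree. This aside is mine, not part of my original workflow: the CLI has a migration list command for exactly this (double-check the flags against your CLI version):
supabase migration list --linked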
My thinking
Since there is a perfectly working remote Supabase, why can’t I just somehow ignore these bad/invalid migration scripts? There must be a way!
Let’s get schemas/tables first
Note: I wish I had captured all the prompts I used, as well as some of the responses I got from Gemini Code Assistant before it crashed (not sure why, but I’ve started to see this happen a lot more recently), so some information I’m sharing below might not be 100% accurate, but hopefully it’s still good enough to give you an idea of how I resolved this bad/invalid migration issue.
So I asked Gemini Code Assistant how I could do this, and after a few rounds of back and forth, it suggested to:
- destroy docker volume
- create dummy migration files (I renamed the supabase/migrations folder to something like supabase/migrations_old; see the sketch after this list)
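For the second step, here is roughly what that looked like on my end (a minimal sketch; the _old suffix is just my naming choice):
mv supabase/migrations supabase/migrations_old
mkdir -p supabase/migrations  # the pull step later will populate this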
That way, when we pull the schema off the remote Supabase and then (I think) apply all the migration files, it no longer errors out (now that I think about it, you could probably just empty the file contents too). This should produce one new migration script at the end that contains the whole schema. You can then ask it to squash all migrations into a single file, and voilà, you have successfully cleaned up your migration history.
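On the squash step: the Supabase CLI also ships its own squash command that collapses the migration history into a single file, so you don’t strictly need the AI for that part (verify the exact flags for your CLI version):
supabase migration squash --local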
I think I then needed to restart the Supabase instance to pick up this migration SQL, and after that, I could finally see all the tables!
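One caveat I’d add here (my note, not from the generated output): a plain restart only picks up migrations if the database volume is fresh; with an existing volume, supabase db reset is the command that re-applies everything under supabase/migrations (plus seed.sql):
supabase stop
supabase start
supabase db reset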
By the way, here is the script it generated for me. I’ll warn you one more time: this will delete your local Supabase, so use it if and only if you are ready to recreate your local instance.
#!/bin/bash
set -euo pipefail
echo "⚠️ This will WIPE your LOCAL Supabase database(s) and resync schema from PROD."
read -p "Are you sure? (yes/no): " confirm
if [[ "$confirm" != "yes" ]]; then
echo "Aborted."
exit 1
fi
# Stop local Supabase
supabase stop || true
# Remove local db volumes
db_volumes=$(docker volume ls --format '{{.Name}}' | grep '^supabase_db_') || true
if [[ -n "$db_volumes" ]]; then
echo "Removing volumes:"
echo "$db_volumes" | xargs docker volume rm || true
else
echo "No supabase_db_* volumes found."
fi
# Start local supabase (empty db, no migrations)
supabase start
# Dump schema from prod (read-only)
supabase db pull --linked --schema public
# Apply the pulled schema to the local DB
# (without --local, `supabase db push` would target the linked remote project)
supabase db push --local
echo "✅ Local Supabase is now synced with PROD schema (migrations ignored)."
To run this, save it at the project root, give it a name (I named mine synch-local-db.bash), give it execute permission (chmod +x {filename}), and then run it with:
./synch-local-db.bash
Populate data next
In my case, the remote Supabase contains a relatively small amount of data (the app is still in beta), so the easiest way to get the data over is to run these commands:
supabase db dump --data-only > supabase/seed.sql
psql "postgresql://postgres:postgres@127.0.0.1:54322/postgres" -f supabase/seed.sql
Download Edge Functions
For Edge Functions, the easiest way is to get the list of functions with:
supabase functions list --project-ref {supabase-project-id}
and then for each function you can run
supabase functions download {function-name} --project-ref {supabase-project-id}
(By the way, you can easily ask the AI to generate these commands for you and concatenate them with “&&” to run everything in one shot, or use a small loop like the one below.)
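Here is what that loop could look like (a sketch; function-one and function-two are placeholder names, so substitute the actual names from the list command above):
for fn in function-one function-two; do
  supabase functions download "$fn" --project-ref {supabase-project-id}
done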
At this point, you should (hopefully) have a pretty good local Supabase, apart from secrets (which you most likely should manage via .env or .env.local) and the Google Auth setup. I’ve almost got the latter working but am still on the final part, so I plan to write about it once I figure everything out. Please stay tuned!