
Import your production data from dokku-postgres

Whether it's to test, to debug caching so your site gets faster (read Nate Berkopec's post), or simply to save time setting up seed data, it can be useful to import your production data locally.

I'm part of a small product shop, and we do most of our deploys with Dokku. Recently we started using the dokku-postgres plugin to manage our Postgres databases. dokku-postgres also makes it really easy to import the production or staging database locally, provided you've set up SSH access to the server.
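
If you haven't set that up yet, a minimal sketch looks something like this (assuming the server running Dokku is reachable as root@your_host and you already have a local SSH key pair):

# copy your public key to the server so the script can ssh in without a password
ssh-copy-id root@your_host
# quick check that both the connection and dokku work
ssh root@your_host "dokku version"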

Disclaimer: this is best suited to relatively small projects. For example, we use it to import less than a year's worth of data from a small project running on a $10/month DigitalOcean droplet.

Put the following script in lib/get_db.sh, fill in the correct values for ssh_host, db_dev, db_to_import and prod_uploads_dir, and you should be good to go.

The script first dumps the database on the server, transfers the dump to your machine with rsync, and then restores it locally with pg_restore. If you have a relevant uploads folder, that gets transferred too.

Run the script with sh lib/get_db.sh from the root of your project.

#!/bin/bash

ssh_attrs=""
# your_host can be an ip or domain
ssh_host="root@your_host"
ssh="ssh $ssh_host $ssh_attrs"
db_dev="local_db_name"
db_to_import="production_db_name"

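# dump the production db on the server, copy the dump over with rsync,
# then restore it into the local db (-O drops ownership so it restores as your local user)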
$ssh "dokku postgres:export $db_to_import > /tmp/$db_to_import.dump"
rsync -avz -e "ssh $ssh_attrs" $ssh_host:/tmp/$db_to_import.dump /tmp/$db_to_import.dump
pg_restore -O -d $db_dev /tmp/$db_to_import.dump

# then we download the uploads folder, which we’ve mounted as persistent storage in our dokku app, see http://glennjon.es/2016/10/24/adding-persistent-storage-to-dokku.html for more information

# prod_uploads_dir should be the dir that you've mounted the persistent storage to, use `dokku storage:list name_of_your_dokku_app` on your server to see that

prod_uploads_dir="/var/lib/dokku/data/storage/name_of_your_dokku_app/uploads/"

# only sync the uploads if the local public/uploads directory exists
if [ -d public/uploads ]
then
    rsync -avz -e "ssh $ssh_attrs" $ssh_host:$prod_uploads_dir public/uploads
fi
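
One thing to watch out for: pg_restore -d expects the target database to exist already. The rake task below takes care of that via db:create, but if you run the script on its own you may need to create the database first, roughly like this (local_db_name standing in for your db_dev value):

# create the local database if it doesn't exist yet, then run the import
createdb local_db_name
sh lib/get_db.sh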

Bonus:

I've also added a rake task that handles dropping, creating and migrating the database around the script:

namespace :db do
    task import_prod: :environment do
        Rake::Task["db:drop"].execute
        puts "dropped current local db"
        Rake::Task["db:create"].execute
        puts "created new local db"
        puts "importing production data ..."
        `sh lib/get_db.sh`
        puts "imported production data"
        Rake::Task["db:migrate"].execute
    end
end

Now, run rake db:import_prod to replace your local data with a copy of your production data.
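
As a quick sanity check afterwards, you can count the rows in a table you expect to hold data; users below is just a hypothetical table name, so substitute one of your own:

rake db:import_prod
# count rows in a table that should contain production data
psql local_db_name -c "SELECT count(*) FROM users;"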
