HOWTO migrate a Django Wagtail site on Digital Ocean App Platform
  1. Create the Django app per the DO tutorial or the DO quickstart (older)
    • ensure you have set the following LOCAL environment variables, e.g. in settings.local or by other means:
import os
os.environ.setdefault('DEVELOPMENT_MODE', 'True')
os.environ.setdefault('DEBUG', 'True')
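In settings.py these can then drive a minimal dev/prod switch; a sketch following the DO tutorial's pattern:

import os

# treat the string 'True' as truthy; anything else (or unset) means production
DEVELOPMENT_MODE = os.getenv('DEVELOPMENT_MODE', 'False') == 'True'
DEBUG = os.getenv('DEBUG', 'False') == 'True'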
  2. Create a Space (S3 bucket) for media
  3. Upload your existing media to the new Space
  4. Optional settings for the new Space if you want to use a custom endpoint:
    • In the Space settings, add a CORS configuration for the base domain (not sure this is necessary; a sketch follows this list)
    • Enable the CDN for the new Space
    • Add a new subdomain, e.g. media.example.net, to the Space
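If you'd rather set CORS programmatically than in the GUI, a hedged sketch with boto3 (the region, bucket, and origin below are placeholders, not values from this guide):

import os
import boto3

# placeholders: substitute your own region, bucket, and origin
client = boto3.client(
    's3',
    region_name='nyc3',
    endpoint_url='https://nyc3.digitaloceanspaces.com',
    aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY'),
)
client.put_bucket_cors(
    Bucket='my-media-bucket',
    CORSConfiguration={'CORSRules': [{
        'AllowedOrigins': ['https://example.net'],
        'AllowedMethods': ['GET', 'HEAD'],
        'AllowedHeaders': ['*'],
        'MaxAgeSeconds': 3600,
    }]},
)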
  5. Get an access key for the Space under API -> Spaces Keys
  6. Add environment variables for media (rough guide in this DO tutorial)
  7. pip install django-storages boto3; pip freeze > requirements.txt
  8. Update settings.py, probably like the following:
# S3 (spaces) storage

# these all do what I expect
AWS_ACCESS_KEY_ID = os.getenv('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.getenv('AWS_SECRET_ACCESS_KEY')
AWS_DEFAULT_ACL = 'public-read' # public files by default
AWS_QUERYSTRING_AUTH = False # don't add auth querystrings to URLs
AWS_S3_FILE_OVERWRITE = False # append characters to dupe filenames

# need this!
AWS_STORAGE_BUCKET_NAME = os.getenv('AWS_STORAGE_BUCKET_NAME')

# not sure how these are put together
AWS_S3_ENDPOINT_URL = os.getenv('AWS_S3_ENDPOINT_URL')
AWS_S3_CUSTOM_DOMAIN = os.getenv('AWS_S3_CUSTOM_DOMAIN')
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}


# for user uploads
if DEBUG:
    DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
else:
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
    MEDIA_URL = f"{AWS_S3_ENDPOINT_URL}/" # not sure this matters in prod?
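With those settings in place, a quick smoke test from ./manage.py shell can confirm uploads actually land in the Space (the file name here is arbitrary):

from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

# saves through whichever backend DEFAULT_FILE_STORAGE selected above
path = default_storage.save('test/hello.txt', ContentFile(b'hello from django-storages'))
print(default_storage.url(path))  # in production this should be a Spaces/CDN URL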
  9. Configure domains:
    • App > Settings > Domains > Add domain
    • If your nameservers are already on DO, “We manage your domain” may be safe
      • it appears they just append to existing records
      • MX records may be disrupted (temporarily?)
  10. Set ALLOWED_HOSTS to allow traffic from all those domains (see the sketch after this list):
    • append to ALLOWED_HOSTS in settings.py, e.g. ALLOWED_HOSTS += os.getenv("DJANGO_ALLOWED_HOSTS", "127.0.0.1,localhost").split(",") (note the +=)
    • ...or add them to the DJANGO_ALLOWED_HOSTS environment variable, comma-separated with no spaces
      • this seems more reliable
    • Is it OK to add wildcard domains, e.g. ${APP_DOMAIN},example.com,.example.net?
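A minimal sketch of the environment-variable approach, assuming DJANGO_ALLOWED_HOSTS is comma-separated with no spaces:

import os

ALLOWED_HOSTS = ['127.0.0.1', 'localhost']
# e.g. DJANGO_ALLOWED_HOSTS="${APP_DOMAIN},example.com,.example.net"
extra_hosts = os.getenv('DJANGO_ALLOWED_HOSTS', '')
if extra_hosts:
    ALLOWED_HOSTS += extra_hosts.split(',')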
  11. Create a custom endpoint for media
    • don’t forget to add this endpoint to AWS_S3_CUSTOM_DOMAIN in the environment variables
  12. Upload the legacy DB dump, probably following this HOWTO. I did this manually from a JSON file using ./manage.py loaddata, but that is not ideal. WIP:
    1. create a new, empty db
    2. temporarily disable Trusted Sources on that db, OR (better) add the local computer to Trusted Sources
    3. This seemed to work: pg_restore -d <your_connection_URI> --no-owner --jobs 4 <path/to/your_dump_file.pgsql>
    4. point the db environment variable to the new db & rebuild (? see the sketch after this list)
    5. Do NOT run migrate
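If the app builds DATABASES from a single DATABASE_URL variable (as the DO Django tutorial does), repointing means editing just that one variable. A sketch, assuming dj-database-url is installed:

import dj_database_url

# reads the DATABASE_URL environment variable set on the App Platform component
DATABASES = {
    'default': dj_database_url.config(conn_max_age=600),
}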
  13. Encrypt sensitive environment variables in the GUI

Dumping remote DB for local development

  1. Disable Trusted Sources. TODO: find a more secure means of doing this
  2. Copy the connection settings from the REMOTE database
  3. Dump the db to your local machine: pg_dump -h <remote_host> -p <port> -U <remote_user> -Fc <remote_database> > path/to/dbdump.pgsql
    • You will be prompted for a password; have it handy
  4. Re-enable Trusted Sources
  5. Create a new, empty DB on your local machine.
    • Do NOT run migrate
  6. Load the dump into the local DB: pg_restore -h localhost -U <local_user> -d <local_database> path/to/dbdump.pgsql
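Both steps could be wrapped in a small script (useful for the "periodically dumping DB for local dev" item below); a sketch with hypothetical host, user, and db names:

import subprocess

# placeholders: substitute your real connection settings
REMOTE = ['-h', 'remote-host.db.ondigitalocean.com', '-p', '25060', '-U', 'doadmin']
DUMP = 'path/to/dbdump.pgsql'

# pg_dump prompts for the remote password unless PGPASSWORD or ~/.pgpass is set
subprocess.run(['pg_dump', *REMOTE, '-Fc', '-f', DUMP, 'defaultdb'], check=True)
subprocess.run(['pg_restore', '-h', 'localhost', '-U', 'local_user',
                '-d', 'local_database', '--no-owner', DUMP], check=True)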

for future research:

  • Private file service on S3
  • redirect away from wildcard domains (e.g. to www)
  • build commands, e.g. ./manage.py migrate (here’s a hint anyway)
  • need more info on DNS/domains
  • periodically dumping DB for local dev
  • sync media for local dev
  • automated spaces backup

other resources
