This guide covers deploying a Docker image of a Rails and PostgreSQL application on Google Cloud Platform using Cloud Build, Container Registry, Cloud Key Management Service, Cloud Run, Cloud SQL, and Cloud Storage. The following links point to a detailed guide by a Google Developer Advocate that served as the base for my build, though it needed some adjustments:
Google Cloud Run on Rails: a real life example Part 1
Google Cloud Run on Rails: a real life example Part 2
Google Cloud Run on Rails: a real life example Part 3
Google Cloud Run on Rails: a real life example Part 4
Create a new Google Cloud Platform project
- Google Cloud Platform :: Console
- Click on the Project dropdown menu and select “Create New Project”
- Name the project and save
- On the project overview dashboard, copy the Project ID for use in Step 2
- Ensure the Google Cloud SDK is installed and up to date
- Google Cloud SDK
gcloud components update
- From the project root directory, authorize GCP
gcloud auth login your_account_email
- Set the current GCP project
gcloud config set project your_project_id
Create a new Cloud SQL instance
- From the GCP Console Navigation Menu, select the SQL app
- Select the PostgreSQL option
- Set the database name
- Set the database password
- Save and create
Connecting from Cloud Run (fully managed) to Cloud SQL
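If you prefer the CLI over the Console, the same instance can be created with gcloud. This is a sketch only; the instance name, tier, and Postgres version here are illustrative, not from the guide:

```shell
# Create a PostgreSQL Cloud SQL instance in the same region as Cloud Run
gcloud sql instances create sql-instance-name \
  --database-version=POSTGRES_12 \
  --region=us-central1 \
  --tier=db-f1-micro

# Set the default user's password and create the production database
gcloud sql users set-password postgres \
  --instance=sql-instance-name \
  --password="your_db_password"
gcloud sql databases create my_app_production --instance=sql-instance-name
```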
Open and Edit: config/database.yml
production:
  <<: *default
  database: my_app_production
  username: <%= ENV['DATABASE_USER'] %>
  password: <%= ENV['DATABASE_PASSWORD'] %>
  # Paste your Cloud SQL instance 'Connection name' after /cloudsql/
  # (the postgresql adapter takes the Unix socket directory via 'host')
  host: "/cloudsql/project_id:us-central1:sql-instance-name"
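Instead of copying the 'Connection name' from the Console, you can read it from the CLI; a small sketch (instance name is illustrative):

```shell
# Prints e.g. project_id:us-central1:sql-instance-name
gcloud sql instances describe sql-instance-name \
  --format="value(connectionName)"
```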
Follow this section of the guide: GCP Environment, Keys, Active Storage
Rails Documentation: Active Storage Overview — Ruby on Rails Guides
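For context, the encrypted key files referenced in cloudbuild.yml below can be produced with Cloud KMS roughly like this. The keyring and key names match my build, but treat this as a sketch rather than the linked guide's exact commands:

```shell
# One-time setup: create a keyring and key, then encrypt the Rails
# production key (repeat 'keys create' + 'encrypt' for master.key)
gcloud kms keyrings create farm-link-secrets --location=us-central1
gcloud kms keys create rails_key \
  --location=us-central1 --keyring=farm-link-secrets --purpose=encryption
gcloud kms encrypt \
  --ciphertext-file=./config/credentials/production.key.enc \
  --plaintext-file=./config/credentials/production.key \
  --location=us-central1 --keyring=farm-link-secrets --key=rails_key
```

The resulting .enc files are safe to commit; Cloud Build decrypts them at build time as shown in the first two steps of cloudbuild.yml.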
I used GCP Cloud Build, Container Registry, and Cloud Run to deploy the production application. My goal was to create the slimmest, most efficient containers possible. The following article was very helpful:
Here is my cloudbuild.yml file:
steps:
# Decrypt Rails production key file
- name: gcr.io/cloud-builders/gcloud
  args: ["kms", "decrypt", "--ciphertext-file=./config/credentials/production.key.enc",
         "--plaintext-file=./config/credentials/production.key",
         "--location=us-central1", "--keyring=farm-link-secrets",
         "--key=rails_key"]
# Decrypt Rails master key file
- name: gcr.io/cloud-builders/gcloud
  args: ["kms", "decrypt", "--ciphertext-file=./config/master.key.enc",
         "--plaintext-file=./config/master.key",
         "--location=us-central1", "--keyring=farm-link-secrets",
         "--key=rails_key_master"]
# Build image tagged with the short commit SHA and pass decrypted secrets
# as build arguments
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '--tag', 'gcr.io/farm-link-284523/farm_link:$SHORT_SHA',
         '--build-arg', 'DB_PASS', '--build-arg', 'MASTER_KEY',
         '--file=./Dockerfile.slim', '.']
  secretEnv: ['DB_PASS', 'MASTER_KEY']
# Push new image to Google Container Registry
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/farm-link-284523/farm_link:$SHORT_SHA']
# Deploy container image to Cloud Run
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: gcloud
  args: ['run', 'deploy', '${_SERVICE_NAME}',
         '--image', 'gcr.io/farm-link-284523/farm_link:$SHORT_SHA',
         '--region', 'us-central1',
         '--platform', 'managed']
timeout: 900s
substitutions:
  _SERVICE_NAME: farm-link-staging
secrets:
- kmsKeyName: projects/farm-link-284523/locations/us-central1/keyRings/farm-link-secrets/cryptoKeys/db_pass
  secretEnv:
    DB_PASS: "your encrypted secret here"
- kmsKeyName: projects/farm-link-284523/locations/us-central1/keyRings/farm-link-secrets/cryptoKeys/master_key_string
  secretEnv:
    MASTER_KEY: "your encrypted secret"
The above will build and deploy the container to Cloud Run automatically, but the production and staging services must be created first. The container image is first created and stored in Container Registry for later use. You can think of Cloud Build as similar to Docker Compose, but it leverages the power of GCP to create builds.
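Creating the staging service for the first time can be done manually before the pipeline takes over. A hedged sketch; the Cloud SQL flag and image tag here are assumptions chosen to match the rest of this setup:

```shell
# One-time: create the Cloud Run service so later Cloud Build deploys
# have something to update
gcloud run deploy farm-link-staging \
  --image gcr.io/farm-link-284523/farm_link:latest \
  --region us-central1 \
  --platform managed \
  --add-cloudsql-instances project_id:us-central1:sql-instance-name \
  --allow-unauthenticated

# Kick off a build manually instead of waiting for a trigger;
# $SHORT_SHA is only populated automatically for triggered builds,
# so supply it yourself here
gcloud builds submit --config cloudbuild.yml \
  --substitutions=SHORT_SHA=$(git rev-parse --short HEAD) .
```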
Here is my slim and efficient Dockerfile (Dockerfile.slim in the build step above) used in production:
######################
# Stage: Builder
FROM ruby:2.7.1-alpine as Builder

RUN apk add --update --no-cache \
    build-base \
    ca-certificates \
    curl \
    git \
    postgresql-dev \
    nodejs-current \
    yarn \
    tzdata

WORKDIR /app

# Install gems
COPY Gemfile* /app/
RUN bundle config set --global frozen 1 \
 && bundle install --without test:development:staging -j4 --retry 3 \
 # Remove unneeded files (cached *.gem, *.o, *.c)
 && rm -rf /usr/local/bundle/cache/*.gem \
 && find /usr/local/bundle/gems/ -name "*.c" -delete \
 && find /usr/local/bundle/gems/ -name "*.o" -delete

# Install yarn packages
COPY package.json yarn.lock /app/
RUN yarn install --production

# Add the Rails app
COPY . /app

# Precompile assets
ARG MASTER_KEY
RUN RAILS_ENV=production RAILS_MASTER_KEY=${MASTER_KEY} bundle exec rake assets:precompile

# Remove folders not needed in the resulting image
RUN rm -rf node_modules tmp/cache vendor/assets lib/assets spec

###############################
# Stage: Runtime
FROM ruby:2.7.1-alpine

# Add Alpine packages
RUN apk add --update --no-cache \
    bash \
    postgresql-client \
    file \
    tzdata

# Copy app with gems from the build stage
COPY --from=Builder /usr/local/bundle/ /usr/local/bundle/
COPY --from=Builder /app /app

# Set Rails environment variables
ENV RAILS_ENV=production
ENV RAILS_LOG_TO_STDOUT=true
ENV RAILS_SERVE_STATIC_FILES=true
ENV PORT=8080

# Set working directory
WORKDIR /app

# Set DATABASE_PASSWORD from the DB_PASS build argument
ARG DB_PASS
ENV DATABASE_PASSWORD=${DB_PASS}

# Set RAILS_MASTER_KEY from the MASTER_KEY build argument
ARG MASTER_KEY
ENV RAILS_MASTER_KEY=${MASTER_KEY}

# Expose Puma port
EXPOSE 8080

# Start up
RUN chmod +x /app/entrypoint.sh
ENTRYPOINT ["/app/entrypoint.sh"]
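The entrypoint.sh the Dockerfile hands off to is not shown above. A minimal sketch of what it might contain, assuming migrations run at boot and Puma is the app server (adjust to your own release process):

```shell
#!/bin/sh
set -e

# Run pending migrations, then hand off to Puma as PID 1 so it
# receives signals from Cloud Run directly
bundle exec rake db:migrate
exec bundle exec puma -C config/puma.rb -p "${PORT:-8080}"
```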