@snmmaurya, last active September 26, 2019
Bitbucket pipeline for multiple AWS accounts (Rails app with RSpec)

Set up a Bitbucket pipeline to deploy a Rails app to Elastic Beanstalk across multiple AWS accounts.

My application is based on:

Postgres
Redis
RSpec

I am using two environments for my Rails application:

  1. staging
  2. production

I have two AWS accounts to deploy to:

A) staging
B) production

Steps -

Add your AWS credentials: in Bitbucket, open your repository settings, choose Repository variables, and add the required environment variables.

For AWS account A

STAGING_AWS_DEFAULT_REGION
STAGING_AWS_ACCESS_KEY_ID
STAGING_AWS_SECRET_ACCESS_KEY

For AWS account B

PRODUCTION_AWS_DEFAULT_REGION
PRODUCTION_AWS_ACCESS_KEY_ID
PRODUCTION_AWS_SECRET_ACCESS_KEY
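If you would rather keep a single generic deploy script instead of duplicating the export lines per branch, a minimal shell sketch (my own variation, not from the pipeline below) can map the branch name to the right variable prefix. BITBUCKET_BRANCH is provided by Bitbucket Pipelines; the snippet defaults it so it runs standalone:

```shell
#!/bin/sh
# Sketch (assumption): pick a variable-name prefix from the branch name,
# then copy that account's credentials into the standard AWS_* variables
# that the aws/eb CLIs read automatically.
BITBUCKET_BRANCH="${BITBUCKET_BRANCH:-staging}"

case "$BITBUCKET_BRANCH" in
  master) PREFIX=PRODUCTION ;;
  *)      PREFIX=STAGING ;;
esac

# Indirect expansion: e.g. STAGING_AWS_ACCESS_KEY_ID -> AWS_ACCESS_KEY_ID.
eval "export AWS_DEFAULT_REGION=\${${PREFIX}_AWS_DEFAULT_REGION}"
eval "export AWS_ACCESS_KEY_ID=\${${PREFIX}_AWS_ACCESS_KEY_ID}"
eval "export AWS_SECRET_ACCESS_KEY=\${${PREFIX}_AWS_SECRET_ACCESS_KEY}"

echo "using $PREFIX credentials"
```

You can sanity-check which account the exported keys belong to with `aws sts get-caller-identity` before deploying.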

I am using two branches to deploy:

B1) staging
B2) master

So far, my bitbucket-pipelines.yml looks like this:

image: ruby:2.5.3

pipelines:
  branches:
    staging:
      - step:
          name: "Build and Test"
          caches:
            - bundler
            - pip
          script:
            - bundle install --path vendor/bundle
            - bundle exec rake db:create
            - bundle exec rake db:migrate
            - bundle exec rake db:test:prepare
            - bundle exec rspec
          services:
            - postgres
            - redis

      - step:
          name: "Deploy on Staging"
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$STAGING_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$STAGING_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$STAGING_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname

    master:
      - step:
          name: "Build and Test"
          caches:
            - bundler
            - pip
          script:
            - bundle install --path vendor/bundle
            - bundle exec rake db:create
            - bundle exec rake db:migrate
            - bundle exec rake db:test:prepare
            - bundle exec rspec
          services:
            - postgres
            - redis

      - step:
          name: "Deploy on Master"
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname


definitions:
  caches:
    bundler: vendor/bundle
    pip: ~/.cache/pip
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: 'username'
        POSTGRES_USER: 'root'
        POSTGRES_PASSWORD: 'password'
    redis: 
      image: redis
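For the RSpec step to reach the Postgres service defined above, config/database.yml needs matching connection details. A minimal sketch of the test section, assuming the service values from the definitions block (Bitbucket exposes service containers on 127.0.0.1):

```yaml
# config/database.yml (test section only) -- the database, username and
# password must mirror the postgres service in bitbucket-pipelines.yml
test:
  adapter: postgresql
  host: 127.0.0.1
  database: username
  username: root
  password: password
```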

Parallel deployment on multiple environments

By default, Bitbucket runs steps sequentially, one after another. Sometimes you want to deploy to multiple environments at the same time; you can run steps concurrently with 'parallel:'.

pipelines:
  branches:
    master:
      parallel:
        - step:
            name: "Deploy on Master Environment 1"
            caches:
              - pip
            script:
              - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
              - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
              - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
              - apt-get update && apt-get install -y python-dev
              - curl -O https://bootstrap.pypa.io/get-pip.py
              - python get-pip.py
              - pip install awsebcli --upgrade
              - pip install awscli --upgrade
              - aws --version
              - eb deploy environmentname1
         
        - step:
            name: "Deploy on Master Environment 2"
            caches:
              - pip
            script:
              - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
              - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
              - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
              - apt-get update && apt-get install -y python-dev
              - curl -O https://bootstrap.pypa.io/get-pip.py
              - python get-pip.py
              - pip install awsebcli --upgrade
              - pip install awscli --upgrade
              - aws --version
              - eb deploy environmentname2

Manual trigger to deploy

If you want to run a step manually, specify trigger: manual. For example, say you have two steps: the first runs RSpec and the second deploys. The spec step should run automatically on every push, but the deployment should run only when you manually trigger it from the Bitbucket UI.

pipelines:
  branches:
    master:
      - step:
          name: "Deploy on Master"
          trigger: manual
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname
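The two-step shape described above can be sketched like this (scripts abbreviated, and the environment name is a placeholder): the test step runs automatically on every push to master, while the deploy step waits for a manual trigger.

```yaml
pipelines:
  branches:
    master:
      - step:
          name: "Build and Test"    # runs automatically on every push
          script:
            - bundle install --path vendor/bundle
            - bundle exec rspec
      - step:
          name: "Deploy on Master"  # waits until triggered from the UI
          trigger: manual
          script:
            - eb deploy environmentname
```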
    