We will use:
- Postgres
- Redis
- RSpec
We will deploy to two environments:
A) staging
B) production
Add your AWS credentials: go to your Bitbucket repository's Settings, choose Repository variables, and add the required environment variables:
STAGING_AWS_DEFAULT_REGION
STAGING_AWS_ACCESS_KEY_ID
STAGING_AWS_SECRET_ACCESS_KEY
PRODUCTION_AWS_DEFAULT_REGION
PRODUCTION_AWS_ACCESS_KEY_ID
PRODUCTION_AWS_SECRET_ACCESS_KEY
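To see how these repository variables are consumed, here is a minimal sketch of what the deploy step does: it copies the prefixed variables onto the standard names the AWS CLI and EB CLI read. The values below are placeholders, not real credentials.

```shell
# Placeholder values standing in for the Bitbucket repository variables.
STAGING_AWS_DEFAULT_REGION="us-east-1"
STAGING_AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
STAGING_AWS_SECRET_ACCESS_KEY="example-secret"

# Map the staging-prefixed variables onto the standard AWS names,
# exactly as the deploy step's script does below.
export AWS_DEFAULT_REGION="$STAGING_AWS_DEFAULT_REGION"
export AWS_ACCESS_KEY_ID="$STAGING_AWS_ACCESS_KEY_ID"
export AWS_SECRET_ACCESS_KEY="$STAGING_AWS_SECRET_ACCESS_KEY"

echo "Deploying to region $AWS_DEFAULT_REGION"
```

The production steps do the same with the PRODUCTION_ prefixed variables.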
The pipeline defines steps for two branches:
B1) staging
B2) master
image: ruby:2.5.3

pipelines:
  branches:
    staging:
      - step:
          name: "Build and Test"
          caches:
            - bundler
            - pip
          script:
            - bundle install --path vendor/bundle
            - rake db:create
            - rake db:migrate
            - rake db:test:prepare
            - bundle exec rspec
          services:
            - postgres
            - redis
      - step:
          name: "Deploy on Staging"
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$STAGING_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$STAGING_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$STAGING_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname
    master:
      - step:
          name: "Build and Test"
          caches:
            - bundler
            - pip
          script:
            - bundle install --path vendor/bundle
            - rake db:create
            - rake db:migrate
            - rake db:test:prepare
            - bundle exec rspec
          services:
            - postgres
            - redis
      - step:
          name: "Deploy on Master"
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname

definitions:
  caches:
    bundler: vendor/bundle
    pip: ~/.cache/pip
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: 'username'
        POSTGRES_USER: 'root'
        POSTGRES_PASSWORD: 'password'
    redis:
      image: redis
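The 'eb deploy' command assumes the repository has been initialized for the EB CLI ('eb init'), which writes the application name and region to .elasticbeanstalk/config.yml; that file must be committed so the pipeline container can read it. A minimal sketch (the application, environment, and region names here are placeholders for your own):

```yaml
# .elasticbeanstalk/config.yml (generated by `eb init`; names are placeholders)
branch-defaults:
  master:
    environment: environmentname
global:
  application_name: yourapp
  default_region: us-east-1
```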
By default Bitbucket runs the steps in a pipeline sequentially, but sometimes we want to deploy to multiple environments at the same time. You can run steps concurrently using 'parallel:'
pipelines:
  branches:
    master:
      - parallel:
          - step:
              name: "Deploy on Master Environment 1"
              caches:
                - pip
              script:
                - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
                - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
                - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
                - apt-get update && apt-get install -y python-dev
                - curl -O https://bootstrap.pypa.io/get-pip.py
                - python get-pip.py
                - pip install awsebcli --upgrade
                - pip install awscli --upgrade
                - aws --version
                - eb deploy environmentname1
          - step:
              name: "Deploy on Master Environment 2"
              caches:
                - pip
              script:
                - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
                - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
                - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
                - apt-get update && apt-get install -y python-dev
                - curl -O https://bootstrap.pypa.io/get-pip.py
                - python get-pip.py
                - pip install awsebcli --upgrade
                - pip install awscli --upgrade
                - aws --version
                - eb deploy environmentname2
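The install-and-deploy commands are repeated verbatim in every deploy step. One way to keep the pipeline DRY is to move them into a small script committed to the repo; the script name and path here are hypothetical, a sketch rather than part of the original setup:

```shell
#!/bin/sh
# scripts/deploy.sh (hypothetical path): consolidates the commands repeated
# in each deploy step. Expects the AWS_* variables to already be exported.
set -e

deploy() {
  environment="${1:?usage: deploy.sh <environment-name>}"
  # Install pip, the EB CLI, and the AWS CLI, as in the steps above.
  apt-get update && apt-get install -y python-dev
  curl -O https://bootstrap.pypa.io/get-pip.py
  python get-pip.py
  pip install awsebcli --upgrade
  pip install awscli --upgrade
  aws --version
  eb deploy "$environment"
}

# Each step's script then shrinks to the three exports plus:
#   - sh scripts/deploy.sh environmentname1
if [ "$#" -gt 0 ]; then
  deploy "$@"
fi
```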
If you want to run a step manually, specify 'trigger: manual'. For example, suppose you have two steps: the first runs RSpec and the second deploys. You want the spec step to run automatically, but the deployment to run only when you trigger it manually.
pipelines:
  branches:
    master:
      - step:
          name: "Build and Test"
          caches:
            - bundler
          script:
            - bundle install --path vendor/bundle
            - bundle exec rspec
          services:
            - postgres
            - redis
      - step:
          name: "Deploy on Master"
          trigger: manual
          caches:
            - pip
          script:
            - export AWS_DEFAULT_REGION=$PRODUCTION_AWS_DEFAULT_REGION
            - export AWS_ACCESS_KEY_ID=$PRODUCTION_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$PRODUCTION_AWS_SECRET_ACCESS_KEY
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - pip install awscli --upgrade
            - aws --version
            - eb deploy environmentname