@fcoclavero
Last active May 2, 2024 23:32
GCP cheatsheet
# role.yaml
# Create a custom IAM role from this file:
# `gcloud iam roles create my_project.deployer --project $PROJECT_ID --file role.yaml`
# Use a dot-separated role ID (e.g. `my.custom.role`) for consistency.
# See: https://cloud.google.com/iam/docs/reference/rest/v1/projects.roles
title: My Custom Role
description: This is a description of my custom role.
stage: ALPHA
etag: BwWWja0YfJA=
includedPermissions:
- a.permission.string
# To download GAE source files, first determine the service and version
gcloud app versions list
# Then describe the version; to download, open the `sourceURL` link in the output
gcloud app versions describe VERSION_ID -s SERVICE_NAME
# Log in to the gcloud CLI as a service account
gcloud auth activate-service-account sa-name@whatever.iam.gserviceaccount.com --key-file=service_account.json
# Get project number from project ID
gcloud projects describe $PROJECT_ID --format="value(projectNumber)"
# Check if API is enabled on selected project
gcloud services list --enabled --filter="NAME:api-name.googleapis.com"
# Get available IAM Roles
gcloud iam roles list > roles.txt
# Get permissions for role
gcloud iam roles describe $ROLE_NAME
# Get the number of files in a GCS bucket
gsutil du gs://BUCKET_NAME | wc -l
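This works because `gsutil du` prints one `<bytes> <object-url>` line per object, so counting lines counts objects. A local illustration with fabricated `du` output:

```shell
# Two hypothetical objects -> two lines -> count of 2
printf '1024 gs://my-bucket/a.txt\n2048 gs://my-bucket/b.txt\n' | wc -l
```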
# Get GCS bucket size, in bytes
gsutil du -s gs://BUCKET_NAME
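The raw byte count can be converted to a human-readable figure locally; a sketch using a hypothetical size value (`gsutil du` also accepts `-h` for a similar effect):

```shell
# Convert a hypothetical byte count to the largest sensible binary unit
bytes=123456789
awk -v b="$bytes" 'BEGIN {
  split("B KiB MiB GiB TiB", u)
  i = 1
  while (b >= 1024 && i < 5) { b /= 1024; i++ }
  printf "%.1f %s\n", b, u[i]
}'
```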
# Create a GCS bucket
gsutil mb -p project-id -c storage-class -l location -b on gs://new-bucket-name
# Upload single file to bucket
gsutil cp /path/to/local/file gs://path/to/gcs/file
# Sync the contents of one GCS bucket to another
gsutil -m rsync -r gs://origin-bucket gs://destination-bucket
# Upload directory to bucket
gsutil -m rsync -r /path/to/local/dir gs://path/to/gcs/dir
# Upload directory to bucket, excluding any subdirectory matching the name `exclude`
gsutil -m rsync -r -x '.*exclude.*$' /path/to/local/dir gs://path/to/gcs/dir
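Per gsutil's rsync docs, the `-x` argument is a Python regular expression matched against paths relative to the source directory. A quick local sanity check of the pattern above (assumes `python3` is available; the file paths are made up for illustration):

```shell
# Show which relative paths the exclude pattern would match
python3 -c "
import re
pat = re.compile(r'.*exclude.*$')
for p in ['keep/file.txt', 'sub/exclude/file.txt']:
    print(p, bool(pat.match(p)))
"
```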
# Delete bucket
gsutil -m rm -r gs://bucket-name
# Get a project's creator (from the earliest audit log entry; works only while that entry is still within the log retention window)
gcloud logging read --order=asc --limit=1 --format='table(protoPayload.methodName, protoPayload.authenticationInfo.principalEmail)' --project $PROJECT_ID
# Generate Terraform files for existing project
gcloud beta resource-config bulk-export --path=local-directory --project=project-id --resource-format=terraform
# Create deployment from YAML
gcloud deployment-manager deployments create example-deployment --config example-deployment.yaml
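For reference, a minimal `example-deployment.yaml` might look like the following (a sketch only; the resource name is hypothetical, and `storage.v1.bucket` is one of Deployment Manager's supported resource types):

```yaml
# example-deployment.yaml -- minimal sketch creating a single GCS bucket
resources:
- name: my-example-bucket   # hypothetical resource name
  type: storage.v1.bucket
  properties:
    location: US
```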
# Create an Artifact Registry repository
gcloud artifacts repositories create --repository-format DOCKER --description "$REPOSITORY_DESCRIPTION" --location us $REPOSITORY_NAME
# List a service account's roles on a project
gcloud projects get-iam-policy $PROJECT_ID \
  --flatten="bindings[].members" \
  --format='table(bindings.role)' \
  --filter="bindings.members:$SERVICE_ACCOUNT_EMAIL"
# Local version of gcurl (fish shell `abbr`; in bash/zsh define an `alias` instead)
abbr gcurl='curl -H "$(oauth2l header --json ~/credentials.json cloud-platform userinfo.email)" -H "Content-Type: application/json"'
# Get API Key resource name
curl -H "$(oauth2l header --json ~/credentials.json cloud-platform userinfo.email)" -H "Content-Type: application/json" https://apikeys.googleapis.com/v2/projects/<PROJECT_NUMBER>/locations/global/keys
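The response is JSON whose `keys` array holds the key resource names. A sketch of extracting them locally; the sample response below is fabricated for illustration:

```shell
# Hypothetical ListKeys response saved locally
cat > keys.json <<'EOF'
{"keys": [{"name": "projects/123456/locations/global/keys/0a1b2c"}]}
EOF
# Print each key's resource name (assumes python3; jq -r '.keys[].name' works too)
python3 -c "import json; [print(k['name']) for k in json.load(open('keys.json'))['keys']]"
```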
# Pull GCS buckets from a list of names, one per line
while IFS= read -r p; do
  gsutil -m cp -R "gs://path/to/buckets/$p" .
done < bucket_names.txt
# Get file details (requires the google-cloud-storage package)
# https://stackoverflow.com/a/56367484/4868562
from google.cloud import storage
client = storage.Client()
bucket = client.bucket("my-bucket-name")
blob = bucket.get_blob("path/to/file")  # returns None if the object does not exist
# Available metadata
blob.content_type
blob.content_encoding
blob.content_disposition
blob.content_language
blob.cache_control
blob.metadata # custom metadata dict
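When uploading, `content_type` is worth setting explicitly; the standard-library `mimetypes` module can guess a value from the filename before the upload (a local sketch, independent of GCS; the filename is hypothetical):

```python
# Guess a Content-Type from a filename using only the standard library
import mimetypes

content_type, encoding = mimetypes.guess_type("report.pdf")
print(content_type)  # application/pdf
```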
# policy.yaml
# Apply the policy with:
# `gcloud projects set-iam-policy my-project-id policy.yaml`
etag: BwWWja0YfJA=
version: 3
bindings:
- members:
  - user:mike@example.com
  - group:admins@example.com
  - domain:google.com
  - serviceAccount:my-project-id@appspot.gserviceaccount.com
  role: roles/resourcemanager.organizationAdmin
- members:
  - user:eve@example.com
  role: roles/resourcemanager.organizationViewer
  condition:
    title: expirable access
    description: Does not grant access after Sep 2020
    expression: request.time < timestamp('2020-10-01T00:00:00.000Z')
# Concrete example granting read access to GCS buckets
- members:
  - serviceAccount:642698354275@cloudservices.gserviceaccount.com
  role: roles/storage.objectViewer