@gitfvb
Last active September 20, 2024 10:17
Writing files via PowerShell to AWS S3 buckets

PowerShell

This works for PowerShell >= 5.1

You need to install the AWS Tools for PowerShell modules beforehand with

Install-Module -Name AWS.Tools.Installer
Install-AWSToolsModule S3

Then create a profile for the credentials

Set-AWSCredential -AccessKey "abc" -SecretKey "def" -StoreAs "s3test"

Now you can create a regular script to upload files

Write-S3Object -BucketName "apteco-cloud-customer" -File .\test.txt -ProfileName s3test

To see the list of files you have uploaded, just use this command

# List all files/objects in bucket
Get-S3Object -ProfileName s3test -BucketName "apteco-cloud-customer"

Python

This needs some packages to be installed, e.g. with pip from PowerShell or Bash

pip install boto3

Then you can directly jump into your code. This example passes the credentials directly in code. For more secure possibilities, please have a look at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

import boto3

# Assign the client so it can be reused below
s3_client = boto3.client('s3', aws_access_key_id='abc', aws_secret_access_key='def')
s3_client.upload_file(Filename='./hw.txt', Bucket='apteco-cloud-client', Key='hw.txt')
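Instead of hard-coding keys, boto3 can also read a named profile from the shared credentials file at ~/.aws/credentials (one of the more secure options described in the guide linked above). A minimal sketch of such a profile entry — the key values are placeholders:

```ini
[s3test]
aws_access_key_id = abc
aws_secret_access_key = def
```

With that file in place, `boto3.Session(profile_name='s3test').client('s3')` builds the client without any credentials appearing in your script.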

To see all uploaded files, simply output the 'Contents' list from the response dict

s3_client.list_objects(Bucket='apteco-cloud-srk')['Contents']
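Each entry in that `Contents` list is a dict with keys such as `Key`, `Size`, and `LastModified`. A small offline sketch — using a hand-made sample shaped like a `list_objects` response instead of a live call — showing how to reduce it to just file names and sizes:

```python
# Hypothetical sample shaped like a boto3 list_objects response;
# a real call would be: response = s3_client.list_objects(Bucket='apteco-cloud-srk')
response = {
    'Contents': [
        {'Key': 'hw.txt', 'Size': 12},
        {'Key': 'test.txt', 'Size': 34},
    ]
}

# Reduce the response to a {key: size} mapping for easy inspection.
# .get() with a default handles empty buckets, where 'Contents' is absent.
files = {obj['Key']: obj['Size'] for obj in response.get('Contents', [])}
print(files)  # {'hw.txt': 12, 'test.txt': 34}
```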