In this blog, we’re going to cover how you can use the Boto3 AWS SDK (software development kit) to download and upload objects to and from your Amazon S3 buckets. For those of you who aren’t familiar with Boto3, it’s the primary Python SDK used to interact with Amazon’s APIs. Using Boto3, you can do everything from accessing objects in S3 to creating CloudFront distributions and new VPC security groups. Python is a great language to get started automating things in your cloud environments.
Below, we’ll walk through some common things you might want to do with your S3 objects that Boto3 can help with.
If you need to install Python, here’s where you can find the installers. Additionally, pip sometimes does not come installed with Python, so you’ll need to install that as well. You can install pip fairly easily if you’re running a Mac:
sudo easy_install pip
Next, you’ll need to run the following pip command to install Boto3:
pip install boto3
Here’s how you can go about downloading a file from an Amazon S3 bucket. In the below example, the contents of the downloaded file are printed out to the console:
import boto3

bucket_name = 'my-bucket'
s3_file_path = 'directory-in-s3/remote_file.txt'
save_as = 'local_file_name.txt'

s3 = boto3.client('s3')
s3.download_file(bucket_name, s3_file_path, save_as)

# Print out the contents of the downloaded file
with open(save_as) as f:
    print(f.read())
To grant a user, group, or role permission to download objects from the bucket, you’ll want to use an IAM or bucket policy. Here’s an example IAM policy to accomplish this; just replace “my-bucket-name” with the name of your S3 bucket:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBasicS3Download",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-bucket-name/*"
        }
    ]
}
Uploading files from the local machine to a target S3 bucket is quite simple. See below for an example:
import boto3

bucket_name = 'my-bucket'

s3 = boto3.client('s3')
with open('local-file.txt', 'rb') as content:
    s3.put_object(
        Bucket=bucket_name,
        Key='directory-in-bucket/remote-file.txt',
        Body=content
    )
In order for an IAM identity to run the code above, it will need the s3:PutObject permission on the bucket. See below for an example IAM policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBasicS3Upload",
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::my-bucket-name/*"
        }
    ]
}
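If you generate policies like this from code, it can be handy to build the document as a Python dict and serialize it with `json`, rather than templating strings. A quick sketch that produces the same policy as above (the function name is our own, purely illustrative):

```python
import json

def make_upload_policy(bucket_name):
    """Build a minimal IAM policy allowing s3:PutObject on one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowBasicS3Upload",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

# Serialize to the JSON form that IAM expects.
print(json.dumps(make_upload_policy("my-bucket-name"), indent=4))
```

Building the dict first avoids quoting mistakes and makes it easy to parameterize the bucket name per environment.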
If you’re using Terraform like we are, it might be useful to automatically populate an S3 bucket with certain files when a new environment gets provisioned. Instead of calling a Python script during scenarios involving new infrastructure, one could simply add the following component to their Terraform resources:
resource "aws_s3_bucket_object" "object" {
  bucket = "your_bucket"
  key    = "s3/path/destination-file.txt"
  source = "localpath/source-file.txt"
  etag   = "${md5(file("localpath/source-file.txt"))}"
}
For reference, you can find the Boto3 S3 SDK here.
If you’d like quick access to the Boto3 documentation and your AWS resources, check out our Clouductivity Navigator product. We help put the documentation you need right at your fingertips.