How-to: Upload Files to Amazon S3 via an API

Abdul R. Wahab
3 min read · Apr 24, 2022


Amazon S3 is a widely used and well-known object storage service offered by Amazon Web Services (AWS). It is used for virtually everything, from storing application log data to gigabyte-sized media files.

Manually Upload to S3? Nah, ain’t nobody got time for that… 👎

AWS makes it super easy to use S3. You can just create your bucket and upload whatever data you want into it.

But what if, instead of manually navigating to our bucket, we upload objects into it by invoking an API? 😃

API to the rescue! 🦸

Having an API to upload data into S3 will enable:

  • LESS manual work 💪
  • Consumer-friendly interaction
  • A quicker upload process
  • API reusability

Let’s go build our API!

We will build our file upload solution using the following AWS services: IAM, API Gateway, and S3.

High-Level Solution Architecture

Step 1 — Create an IAM Role with the required policy permissions

This role will have a policy attached that allows our API to upload files into our S3 bucket through an API Gateway endpoint.

Make sure your policy looks similar to what I have shared below:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::file-upload-bucket/*"
    }
  ]
}
IAM Role
Policy attached to the IAM Role with Custom Permissions to PUT in S3
API Gateway service included as a Trusted Entity to our IAM Role
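Adding API Gateway as a trusted entity boils down to a trust policy on the role. A minimal sketch of what that JSON looks like (the document AWS generates in the console may differ slightly in formatting):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "apigateway.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

This is what lets API Gateway assume the role and use its `s3:PutObject` permission on our behalf.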

Step 2 — Create the API Gateway

Now, we are going to create an API Gateway that will be the entry point to our File Upload API.

As shown below, the folder and file resources here represent our S3 Bucket and S3 Object respectively.

These resources will serve as path parameters that the client specifies in the PUT request URL.

2a) Create the {folder} resource
2b) Create the {file} child resource under {folder}
2c) Add the /PUT Request under {folder}/{file} with the following settings
2d) Add your URL Path Parameters to match the S3 Bucket and Objects with the API Resources
2e) Include application/csv as a Binary Media Type.
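The console steps above can also be scripted. A rough AWS CLI sketch, assuming placeholder IDs and ARNs (`$API_ID`, `$ROOT_ID`, `$FOLDER_ID`, `$FILE_ID`, and the role ARN are all hypothetical; the real values come from your own API and account):

```shell
# 2a/2b) Create the {folder} resource, then {file} beneath it
aws apigateway create-resource --rest-api-id "$API_ID" \
  --parent-id "$ROOT_ID" --path-part "{folder}"
aws apigateway create-resource --rest-api-id "$API_ID" \
  --parent-id "$FOLDER_ID" --path-part "{file}"

# 2c) Add the PUT method on /{folder}/{file}
aws apigateway put-method --rest-api-id "$API_ID" \
  --resource-id "$FILE_ID" --http-method PUT \
  --authorization-type NONE \
  --request-parameters "method.request.path.folder=true,method.request.path.file=true"

# 2d) Integrate with S3, mapping {folder}/{file} onto bucket/key,
#     using the IAM role from Step 1 as the execution credentials
aws apigateway put-integration --rest-api-id "$API_ID" \
  --resource-id "$FILE_ID" --http-method PUT \
  --type AWS --integration-http-method PUT \
  --uri "arn:aws:apigateway:us-east-1:s3:path/{folder}/{file}" \
  --credentials "arn:aws:iam::111122223333:role/s3-upload-role" \
  --request-parameters "integration.request.path.folder=method.request.path.folder,integration.request.path.file=method.request.path.file"

# 2e) Register application/csv as a binary media type
#     (the "/" in the media type is escaped as "~1" in the patch path)
aws apigateway update-rest-api --rest-api-id "$API_ID" \
  --patch-operations op=add,path=/binaryMediaTypes/application~1csv
```

The key piece is 2d: the `--request-parameters` mapping forwards the `{folder}` and `{file}` URL parameters into the S3 integration path, which is how the bucket and object name from the request URL end up as the upload destination.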

Step 3 — Deploy the API and Call it!

My Postman call to the API Gateway with the bucket and file name
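Postman aside, the same PUT call can be scripted. A minimal sketch using only Python's standard library; the invoke URL, bucket, and file name below are placeholders for your own values:

```python
import urllib.request

# Hypothetical invoke URL -- substitute your own API ID, region, and stage.
INVOKE_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/dev"
BUCKET = "file-upload-bucket"  # fills the {folder} path parameter
KEY = "test.csv"               # fills the {file} path parameter

def build_upload_request(data: bytes) -> urllib.request.Request:
    """Build the PUT request that uploads `data` to {folder}/{file}."""
    return urllib.request.Request(
        url=f"{INVOKE_URL}/{BUCKET}/{KEY}",
        data=data,
        method="PUT",
        headers={"Content-Type": "application/csv"},
    )

req = build_upload_request(b"col1,col2\n1,2\n")
# urllib.request.urlopen(req)  # uncomment to actually send the upload
```

The URL shape mirrors the resources from Step 2: the stage URL, then the bucket as `{folder}`, then the object name as `{file}`.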

Check out our S3 Bucket:

test.csv file has landed in S3

Want to learn more about Data Engineering? AWS Cloud?

Check out my series listed below! 🙂

Data Engineering:👇

Data + Cloud (Redshift, Data Mesh)


AWS Cloud:👇

AWS Cloud




