Uploading large files to S3 with a traditional single-request approach can be difficult. If you are transferring a 5GB database backup and the network connection drops, you have to restart the entire upload from scratch, wasting bandwidth and time. And as file sizes grow, this approach only becomes less reliable.
With a single PUT operation, you can upload an object of up to 5GB. But when it comes to uploading large objects (above 5GB) to Amazon S3, the Multipart Upload feature is the better option.
Multipart Upload lets you upload large files as a set of small, independent parts that are uploaded separately and reassembled by S3.
In this guide, you will learn how to perform multipart uploads using the AWS CLI.
Table of Contents

How Multipart Uploads Work
Prerequisites
Step 1: Split the Object
Step 2: Create an Amazon S3 Bucket
Step 3: Initiate the Multipart Upload
Step 4: Upload the Split Files to the S3 Bucket
Step 5: Create a JSON File to Compile the ETag Values
Step 6: Complete the Multipart Upload on S3
Conclusion
How Multipart Uploads Work
In a multipart upload, a large file transfer is divided into smaller parts that are uploaded to Amazon S3 separately. After all of the parts finish uploading, S3 reassembles them into the complete object.
For example, a 160GB file split into 1GB parts produces 160 individual upload operations on S3. Each part receives its own identifier, and S3 keeps track of the ordering information to guarantee the file is reconstructed correctly.
This system supports retry logic for failed parts and makes pause/resume functionality possible. Here is a diagram that shows what the multipart upload process looks like:
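In AWS CLI terms, this flow maps onto three s3api commands, each of which the steps below cover in detail. A minimal sketch (the bucket name, key, and upload ID are placeholders):

# 1. Initiate the upload; S3 returns an UploadId
aws s3api create-multipart-upload --bucket DOC-EXAMPLE-BUCKET --key large_test_file

# 2. Upload each part individually, numbered in order
aws s3api upload-part --bucket DOC-EXAMPLE-BUCKET --key large_test_file --part-number 1 --body part-file --upload-id "YOUR_UPLOAD_ID"

# 3. Ask S3 to stitch the parts back into one object
aws s3api complete-multipart-upload --bucket DOC-EXAMPLE-BUCKET --key large_test_file --multipart-upload file://multipart.json --upload-id "YOUR_UPLOAD_ID"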
Prerequisites
Before starting this guide, make sure your machine has the AWS CLI installed. If you do not already have it installed, follow the steps below.
Step 1: Download AWS CLI
To download the CLI, see the AWS CLI installation documentation and download the version for your operating system (Windows, Linux, macOS).
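For example, on 64-bit Linux the installation typically looks like this (check the linked documentation for the current instructions for your platform):

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

Once the CLI is installed, the next step is to configure your AWS IAM credentials in your terminal.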
Step 2: Create AWS IAM credentials
To configure your AWS credentials, go to your terminal and run the following command:
aws configure
This command prompts you to enter your credentials, such as the AWS Access Key ID and Secret Access Key. To obtain these credentials, create a new IAM user in your AWS account by following the steps below. (If you already have an IAM user and security credentials, you can skip these steps.)
1. Sign in to your AWS Console.
2. Click the search bar at the top of your dashboard and search for “IAM”.
3. Click IAM.
4. In the left navigation pane, navigate to Access management > Users.
5. Click Create user.
6. During user creation, attach a policy directly by selecting Attach policies directly in Step 2: Set permissions.
7. Grant the user access by searching for “Admin” in the search bar and selecting AdministratorAccess under Permissions policies.
8. On the next page, click Create user.
9. Click the newly created user in the Users section and navigate to Security credentials.
10. Scroll down and click Create access key.
11. Select the Command Line Interface (CLI) use case.
12. On the next page, click Create access key.
13. You will now see your access keys. Keep them safe, and do not expose them publicly or share them with anyone.
You can now copy these access keys into your terminal after running the aws configure command.
You will be prompted to add the following details:

AWS Access Key ID: from the IAM user credentials you created. See the steps above.
AWS Secret Access Key: from the IAM user credentials you created. See the steps above.
Default region name: the name of your default AWS region, for example, us-east-1.
Default output format: leave this as None.
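For reference, a completed aws configure session looks roughly like this (the values shown are AWS's documentation placeholders, not real credentials):

aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: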
That completes the CLI configuration.
To confirm that the CLI is installed and configured correctly, run the command below:
aws --version
You should see the CLI version in your terminal, as shown below:
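The output will look something like this (the exact version and platform details will differ on your machine):

aws-cli/2.15.30 Python/3.11.8 Linux/6.5.0 exe/x86_64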
Now you are ready for the main multipart upload steps 🙂
Step 1: Split the Object
The first step is to split the object you want to upload into parts. In this guide, we will split a 188MB video file into smaller sections.
Note that this process works just as well for much larger files.
Next, locate the object you plan to upload on your system. You can use the cd command in your terminal to navigate to the folder where the object is stored.
Then run the split command:
split -b <size>M <filename>
Replace <size> with your desired part size in megabytes (for example, 100, 150, 200).
For this use case, we will split our 188MB video file into 30MB parts, with the size given in bytes. The command is:
split -b 31457280 videoplayback.mp4
Next, run the ls -lh command in your terminal. You should get the output below:
Here, you can see that the 188MB file has been split into multiple parts (30MB each, plus a final 7.9MB part). When you go to the folder where the object is stored on your system, you will see the new files, with names that look like xaa, xab, xac, and so on.
These files represent the different parts of your object. For example, xaa is the first part of your file, and it will be the first part uploaded to S3. More on that later in the guide.
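If you want to convince yourself that the parts really do add up to the original file, you can concatenate them locally and compare them byte for byte (a quick sanity check, assuming the default xa* part names):

# Reassemble the parts and compare against the original
cat xa* > rejoined.mp4
cmp videoplayback.mp4 rejoined.mp4 && echo "parts match the original"
rm rejoined.mp4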
Step 2: Create an Amazon S3 Bucket
If you do not already have an S3 bucket, follow the steps in the AWS Getting Started with Amazon S3 documentation to create one.
Step 3: Initiate the Multipart Upload
The next step is to initiate the multipart upload. To do this, run the command below:
aws s3api create-multipart-upload --bucket DOC-EXAMPLE-BUCKET --key large_test_file
In this command:

DOC-EXAMPLE-BUCKET is your S3 bucket name.
large_test_file is the file name, for example, videoplayback.mp4.
You will get a JSON response in your terminal containing an UploadId. The response looks like this:
{
"ServerSideEncryption": "AES345",
"Bucket": "s3-multipart-uploads",
"Key": "videoplayback.mp4",
"UploadId": "************************************"
}
Save the UploadId somewhere on your local machine; you will need it in later steps.
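If you are scripting this, you can also capture the UploadId straight into a shell variable with the CLI's --query option instead of copying it by hand (a sketch using the same placeholder bucket and key):

UPLOAD_ID=$(aws s3api create-multipart-upload --bucket DOC-EXAMPLE-BUCKET --key large_test_file --query UploadId --output text)
echo "$UPLOAD_ID"

The later upload-part and complete-multipart-upload commands can then reference $UPLOAD_ID directly.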
Step 4: Upload the Split Files to the S3 Bucket
Remember the split files saved as xaa, xab, and so on? Now it is time to upload them to your S3 bucket. To do this, run the command below:
aws s3api upload-part --bucket DOC-EXAMPLE-BUCKET --key large_test_file --part-number 1 --body large_test_file.001 --upload-id exampleTUVGeKAk3Ob7qMynRKqe3ROcavPRwg92eA6JPD4ybIGRxJx9R0VbgkrnOVphZFK59KCYJAO1PXlrBSW7vcH7ANHZwTTf0ovqe6XPYHwsSp7eTRnXB1qjx40Tk
In this command:

DOC-EXAMPLE-BUCKET is your S3 bucket name.
large_test_file is the file name, for example, videoplayback.mp4.
large_test_file.001 is the name of the part file, for example, xaa.
upload-id is your own UploadId from the previous step; replace the example value with it.
The command returns a response containing the ETag value of the part you just uploaded:
{
"ServerSideEncryption": "aws:kms",
"ETag": "\"7f9b8c3e2a1d5f4e8c9b2a6d4e8f1c3a\"",
"ChecksumCRC64NVME": "mK9xQpD2WnE="
}
Copy the ETag value and save it on your local machine, as you will need it later as a reference.
Repeat the above command for the remaining file parts, incrementing both the part number and the file name with each upload. For example, xaa becomes xab, --part-number 1 becomes --part-number 2, and so on.
Note that the upload speed depends on how big the object is and how fast your internet connection is.
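Rather than running upload-part by hand for every file, a small shell loop can handle the incrementing for you. A sketch, assuming the default xa* part names from the split step and the $UPLOAD_ID variable captured earlier:

# Upload every split file as the next numbered part
PART=1
for FILE in xa*; do
  aws s3api upload-part --bucket DOC-EXAMPLE-BUCKET --key large_test_file --part-number "$PART" --body "$FILE" --upload-id "$UPLOAD_ID"
  PART=$((PART + 1))
done

Because the glob expands alphabetically (xaa, xab, ...), the part numbers stay in the right order.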
To confirm that all parts of the file have been successfully uploaded, run the command below:
aws s3api list-parts --bucket s3-multipart-uploads --key videoplayback.mp4 --upload-id p0NU3agC3C2tOi4oBmT8lHLebUYqYXmWhEYYt8gc8jXlCStEZYe1_kSx1GjON2ExY_0T.4N4E6pjzPlNcji7VDT6UomtNYUhFkyzpQ7IFKrtA5Dov8YdC20c7UE20Qf0
Replace the example upload ID with your own UploadId.
You should get a JSON response like this:
{
"Parts": (
{
"PartNumber": 1,
"LastModified": "2025-07-27T14:22:18+00:00",
"ETag": "\"f7b9c8e4d3a2f6e8c9b5a4d7e6f8c2b1\"",
"Size": 26214400
},
{
"PartNumber": 2,
"LastModified": "2025-07-27T14:25:42+00:00",
"ETag": "\"a8e5d2c7f9b4e6a3c8d5f2e9b7c4a6d3\"",
"Size": 26214400
},
{
"PartNumber": 3,
"LastModified": "2025-07-27T14:28:15+00:00",
"ETag": "\"c4f8e2b6d9a3c7e5f8b2d6a9c3e7f4b8\"",
"Size": 26214400
},
{
"PartNumber": 4,
"LastModified": "2025-07-27T14:31:03+00:00",
"ETag": "\"e9c3f7a5d8b4e6c9f2a7d4b8c6e3f9a2\"",
"Size": 26214400
},
{
"PartNumber": 5,
"LastModified": "2025-07-27T14:33:47+00:00",
"ETag": "\"b6d4a8c7f5e9b3d6a2c8f4e7b9c5d8a6\"",
"Size": 26214400
},
{
"PartNumber": 6,
"LastModified": "2025-07-27T14:36:29+00:00",
"ETag": "\"d7e3c9f6a4b8d2e5c7f9a3b6d4e8c2f5\"",
"Size": 26214400
},
{
"PartNumber": 7,
"LastModified": "2025-07-27T14:38:52+00:00",
"ETag": "\"f2a6d8c4e7b3f6a9c2d5e8b4c7f3a6d9\"",
"Size": 15728640
}
]
}
This confirms that all parts have been uploaded successfully.
Step 5: Create a JSON File to Compile the ETag Values
The file we are about to create tells AWS which parts make up the object and in what order. Collect the ETag values from each uploaded file part and organize them in a JSON structure.
Sample JSON format:
{
  "Parts": [
    {
      "ETag": "example8be9a0268ebfb8b115d4c1fd3",
      "PartNumber": 1
    },
    ...
    {
      "ETag": "example246e31ab807da6f62802c1ae8",
      "PartNumber": 4
    }
  ]
}
Save the created JSON file in the same folder as your object and name it multipart.json. You can use any IDE or text editor of your choice to create and save this file.
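If you would rather not assemble this file by hand, you can generate it from the list-parts output using the CLI's JMESPath --query option (a sketch; substitute your own bucket, key, and upload ID):

aws s3api list-parts --bucket s3-multipart-uploads --key videoplayback.mp4 --upload-id "$UPLOAD_ID" --query '{Parts: Parts[*].{ETag: ETag, PartNumber: PartNumber}}' --output json > multipart.json

This produces the same {Parts: [...]} structure shown above, with one entry per uploaded part.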
Step 6: Complete the Multipart Upload on S3
To complete the multipart upload, run the command below:
aws s3api complete-multipart-upload --multipart-upload file://fileparts.json --bucket DOC-EXAMPLE-BUCKET --key large_test_file --upload-id exampleTUVGeKAk3Ob7qMynRKqe3ROcavPRwg92eA6JPD4ybIGRxJx9R0VbgkrnOVphZFK59KCYJAO1PXlrBSW7vcH7ANHZwTTf0ovqe6XPYHwsSp7eTRnXB1qjx40Tk
Replace fileparts.json with multipart.json, and substitute your own bucket name, key, and upload ID.
You should get output similar to the following:
{
"ServerSideEncryption": "AES256",
"Location": "https://s3-multipart-uploads.s3.eu-west-1.amazonaws.com/videoplayback.mp4",
"Bucket": "s3-multipart-uploads",
"Key": "videoplayback.mp4",
"ETag": "\"78298db673a369adf33dd8054bb6bab7-7\"",
"ChecksumCRC64NVME": "d1UPkm73mAE=",
"ChecksumType": "FULL_OBJECT"
}
Now, when you go to your S3 bucket and hit refresh, you should see the uploaded object.
Here, you can see the complete file along with its name, type, and size.
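You can also verify the object from the terminal instead of the console. The head-object command returns the object's metadata, including its size and ETag, without downloading it (substitute your own bucket and key):

aws s3api head-object --bucket s3-multipart-uploads --key videoplayback.mp4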
Conclusion
Multipart uploads transform large file transfers to Amazon S3 from fragile, all-or-nothing operations into resilient, resumable processes. By splitting files into manageable parts, you gain failure recovery, better performance, and a way around S3's 5GB single-upload limit.
This approach is essential for production environments dealing with database backups, video files, or any other large assets. With the AWS CLI techniques covered in this guide, you are now equipped to handle S3 transfers confidently, regardless of file size or network conditions.
Check the AWS Knowledge Center documentation for more information about multipart uploads using the AWS CLI.