I believe the S3 checksum is useless for verifying huge files larger than 1 GB, since it is a further hash computed over per-part hashes, with the parts split at an arbitrary size chosen by the uploading client.
I have a 1 GB file uploaded to AWS S3.
The SHA256 checksum value is "o9mK1Ay32kIpvW157S40b/2siazR/+tpuz6OYCsjNBU=-2620".
Is there any way to verify that the local file and the file uploaded to S3 are identical in content, without downloading it, of course?
I am hoping to use the AWS SDK or CLI to calculate the same checksum value from the local file.
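As I understand it, the composite value is the SHA256 of the concatenated binary SHA256 digests of each part, base64-encoded, with the part count appended after a dash. A minimal Python sketch of that calculation, assuming the part size used at upload is known (8 MiB is only the AWS CLI default, and the file name is a placeholder; whatever the real part size was, it should reproduce the `-2620` suffix):

```python
import base64
import hashlib

def composite_sha256(path, part_size=8 * 1024 * 1024):
    """SHA256-of-SHA256s: hash each part, concatenate the binary digests,
    hash the concatenation, base64-encode, and append '-<part count>'."""
    part_digests = []
    with open(path, "rb") as f:
        while chunk := f.read(part_size):
            part_digests.append(hashlib.sha256(chunk).digest())
    combined = hashlib.sha256(b"".join(part_digests)).digest()
    return f"{base64.b64encode(combined).decode()}-{len(part_digests)}"

# part_size must match whatever the uploading client used;
# 8 MiB is just the AWS CLI default and is an assumption here.
print(composite_sha256("bigfile.bin"))
```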
2 Answers
I was able to get the hash value of every part with the get-object-attributes command. The byte count of each part is listed explicitly, so I assume the listing is complete.
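For example, via the SDK (a boto3 sketch equivalent to the `aws s3api get-object-attributes` CLI call; bucket and key names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

resp = s3.get_object_attributes(
    Bucket="my-bucket",
    Key="bigfile.bin",
    ObjectAttributes=["Checksum", "ObjectParts"],
    MaxParts=1000,  # the part listing is paginated
)

# Composite checksum of the whole object (the "...=-2620"-style value).
print(resp["Checksum"]["ChecksumSHA256"])

# Per-part sizes and SHA256 digests.
for part in resp["ObjectParts"]["Parts"]:
    print(part["PartNumber"], part["Size"], part["ChecksumSHA256"])
```

With the per-part sizes and digests in hand, each part can be re-hashed locally over the corresponding byte range and compared without downloading anything.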
From Checking object integrity – Amazon Simple Storage Service: