Here is my code for uploading the image to AWS S3:
```python
import boto3
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

@app.post("/post_ads")
async def create_upload_files(files: list[UploadFile] = File(description="Multiple files as UploadFile")):
    main_image_list = []
    for file in files:
        s3 = boto3.resource(
            's3',
            aws_access_key_id=aws_access_key_id,
            aws_secret_access_key=aws_secret_access_key
        )
        bucket = s3.Bucket(aws_bucket_name)
        bucket.upload_fileobj(file.file, file.filename, ExtraArgs={"ACL": "public-read"})
```
Is there any way to compress the image size and upload the image to a specific folder using boto3? I have this function for compressing the image, but I don't know how to integrate it with boto3:
```python
for file in files:
    im = Image.open(file.file)
    im = im.convert("RGB")
    im_io = BytesIO()
    im = im.save(im_io, 'JPEG', quality=50)
    s3 = boto3.resource(
        's3',
        aws_access_key_id=aws_access_key_id,
        aws_secret_access_key=aws_secret_access_key
    )
    bucket = s3.Bucket(aws_bucket_name)
    bucket.upload_fileobj(file.file, file.filename, ExtraArgs={"ACL": "public-read"})
```
Update #1
After following Chris's recommendation, my problem has been resolved. Here is Chris's solution:

```python
im_io.seek(0)
bucket.upload_fileobj(im_io, file.filename, ExtraArgs={"ACL": "public-read"})
```
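Putting the pieces together (the compression code, Chris's `seek(0)` fix, and the "specific folder" part of the question, since in S3 a folder is just a key prefix), a minimal sketch of the per-file logic might look as follows. The helper names `compress_to_jpeg` and `upload_compressed` and the `"ads"` prefix are illustrative assumptions, and `bucket` is the `s3.Bucket(aws_bucket_name)` object from the question:

```python
# Sketch: compress each upload with Pillow into an in-memory buffer,
# rewind it, then upload the buffer (not the original file) under a key
# prefix, which S3 displays as a "folder".
from io import BytesIO
from PIL import Image

def compress_to_jpeg(fileobj, quality: int = 50) -> BytesIO:
    """Re-encode an image file object as JPEG into a rewound BytesIO buffer."""
    with Image.open(fileobj) as im:
        rgb = im.convert("RGB")  # JPEG cannot store an alpha channel
    buffer = BytesIO()
    rgb.save(buffer, "JPEG", quality=quality)
    buffer.seek(0)  # rewind: save() leaves the cursor at the end of the buffer
    return buffer

def upload_compressed(upload_file, bucket, folder: str = "ads"):
    """Compress an UploadFile and upload it under a key prefix ("folder")."""
    buffer = compress_to_jpeg(upload_file.file)
    try:
        bucket.upload_fileobj(
            buffer,
            f"{folder}/{upload_file.filename}",  # the key prefix acts as the folder
            ExtraArgs={"ACL": "public-read"},
        )
    finally:
        # Release the buffer and the spooled upload file once the upload is done.
        buffer.close()
        upload_file.file.close()
```

Inside the endpoint you would then call `upload_compressed(file, bucket)` in the `for file in files:` loop instead of uploading `file.file` directly.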
2 Answers
You seem to be saving the image bytes to a `BytesIO` stream, which is never used, as you upload the original file object to the S3 bucket instead, as shown in this line of your code: `bucket.upload_fileobj(file.file, file.filename, ExtraArgs={"ACL": "public-read"})`. Hence, you need to pass the `BytesIO` object to the `upload_fileobj()` function, and make sure to call `.seek(0)` before that, in order to rewind the cursor (or "file pointer") to the start of the buffer. The reason for calling `.seek(0)` is that the `im.save()` method uses the cursor to iterate through the buffer, and when it reaches the end, it does not reset the cursor to the beginning. Hence, any future read operations would start at the end of the buffer. The same applies to reading from the original file, as described in this answer: you would need to call `file.file.seek(0)` if the file contents were already read and you needed to read from the file again.

When loading the image into a `BytesIO` stream and using it to upload the file/image, please remember to properly `close()` the `UploadFile`, `Image`, and `BytesIO` objects, in order to release their memory (see the related answer as well).

As for the URL, using `ExtraArgs={"ACL": "public-read"}` should work as expected and make your resource (file) publicly accessible. Hence, please make sure you are accessing the correct URL.

Alternatively, the files can be downloaded, compressed, and synced back to the bucket using the AWS CLI:

```shell
aws s3 sync s3://your-pics .
for file in "$(find . -name "*.jpg")"; do gzip "$file"; echo "$file"; done
aws s3 sync . s3://your-pics --content-encoding gzip --dryrun
```

This will download all the files in the S3 bucket to the machine (or EC2 instance), compress the image files, and upload them back to the S3 bucket.
This should help you.
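To see in isolation why the `.seek(0)` call in the first answer matters, here is a minimal demonstration of the cursor behaviour of an in-memory buffer:

```python
from io import BytesIO

buffer = BytesIO()
buffer.write(b"fake jpeg bytes")  # like im.save(), writing leaves the cursor at the end

print(buffer.read())   # b'' -- a read starting at the end returns nothing
buffer.seek(0)         # rewind the cursor to the start of the buffer
print(buffer.read())   # b'fake jpeg bytes'
```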