I want to upload files to S3 from a browser accessing a web application running on Fargate. However, this S3 bucket must have Block Public Access enabled, so it is impossible to upload directly from the browser.
I have some alternative ideas:

1. Upload files into the container in Fargate and copy them to S3 with Lambda.
Q1) Is it possible to access a directory inside the container from Lambda?
Q2) How can the Lambda be triggered?

2. Upload files to EBS and copy them to S3 with Lambda.
Q1) Is it possible to access EBS from Lambda? (maybe yes?)
Q2) Is it possible to trigger the Lambda when a new file is created on EBS?
Or is there a standard, practical method for this case?
The web application is Python/Django.
Any hint is appreciated.
Thank you very much.
2 Answers
One approach would be to use pre-signed URLs to allow your users to upload files directly to S3 without exposing the bucket. Pre-signed URLs are generated using your AWS credentials and are only valid for a limited period of time. This way, the S3 bucket can have block public access enabled, and your users can still upload files from the browser.
The official AWS documentation shows how to:
- Generate a presigned URL that can perform an S3 action for a limited time, then use the Requests package to make a request with the URL.
- Generate a presigned POST request to upload a file.
Source: https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html#generating-presigned-url
1-Q1) Not the default ephemeral file system in Fargate, but you could share an EFS volume between Fargate and Lambda.
1-Q2) There’s not a clean way to do this.
2-Q1) Not only is it impossible to access EBS from Lambda, it is also impossible to access EBS from Fargate.
2-Q2) No.
My first question would be why you even want to involve Lambda in the process. Why do you think including Lambda would be helpful here at all?
You could simply upload directly to your Django application running in Fargate, and then have the Python code copy the file to S3 after it receives the upload. There’s even a static files S3 storage backend for Django. This looks like a good tutorial for doing that.
Although it doesn’t sound like you want to serve the files from S3, only upload them to S3, so that tutorial may be more than you need. You probably just need to use boto3 to copy the file to S3 after it is uploaded.
Alternatively, you could use S3 pre-signed URLs: the server generates a URL and passes it to the browser, and the browser uses it to upload directly to S3, even though the bucket has Block Public Access enabled.
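The client side of that flow can be sketched with the Python Requests package standing in for the browser’s fetch/XHR call; the function name is illustrative, and the URL would come from the server’s presigned-URL endpoint:

```python
import requests


def upload_via_presigned_put(presigned_url, file_bytes):
    """PUT the raw file bytes to the presigned URL; S3 answers 200 on success."""
    response = requests.put(presigned_url, data=file_bytes)
    return response.status_code
```

In the browser, the equivalent is a `fetch(url, {method: "PUT", body: file})` call with the presigned URL received from the server.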