I have a Python method that sends one file from a local folder to S3. I want to modify the method so it sends all the files in that folder to S3.
My method is:
import boto3
from boto3 import Session
from botocore.exceptions import NoCredentialsError

session_root = boto3.Session(region_name='eu-west-3', profile_name='my_profile')
s3_client = session_root.client('s3')

prefix_key = "S3_folder/"
local_path = "/home/local_folder/"
bucket_name = "bucket_S3"

def upload_to_aws(local_file, bucket, s3_file):
    try:
        s3_client.upload_file(local_file, bucket, s3_file)
        print("Upload Successful")
        return True
    except FileNotFoundError:
        print("The file was not found")
        return False
    except NoCredentialsError:
        print("Credentials not available")
        return False

uploaded_files_to_S3 = upload_to_aws(local_path, bucket_name, prefix_key)
Do you have an idea how I can modify it to take all the files in my local_folder and send them to S3_folder?
Thanks,
2 Answers
You can achieve this with a single AWS CLI command.
The above command can also be run from Python:
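A minimal sketch of that approach, assuming the intended command is aws s3 sync (aws s3 cp --recursive would also work) and that the AWS CLI is installed with the my_profile credentials configured:

import subprocess

local_path = "/home/local_folder/"
bucket_name = "bucket_S3"
prefix_key = "S3_folder/"

# Run the AWS CLI sync command as a subprocess; it uploads every file
# under local_path to s3://bucket_S3/S3_folder/ in a single call.
subprocess.run(
    [
        "aws", "s3", "sync",
        local_path,
        f"s3://{bucket_name}/{prefix_key}",
        "--profile", "my_profile",
        "--region", "eu-west-3",
    ],
    check=True,  # raise CalledProcessError if the command fails
)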
The upload_file() API call can only upload one file at a time. Therefore, you will need to loop through your directory and upload each file individually:
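A minimal sketch of such a loop, reusing the names from the question (local_path, bucket_name, prefix_key and the upload_to_aws helper):

import os

# Loop over every entry in the local folder and upload each regular file,
# keeping its original file name under the S3 prefix.
results = {}
for file_name in os.listdir(local_path):
    full_path = os.path.join(local_path, file_name)
    if os.path.isfile(full_path):
        s3_key = prefix_key + file_name  # e.g. "S3_folder/data.csv"
        results[file_name] = upload_to_aws(full_path, bucket_name, s3_key)

print(results)  # maps each file name to its True/False upload status

If local_folder can contain nested subdirectories, use os.walk() instead of os.listdir() so that files at any depth are uploaded.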