I want to upload my log files to my S3 bucket.
I have never used Python or boto3 before.
This is my code:
import os
import datetime as dt
import boto3

x = dt.datetime.now()
date = x.strftime("%Y%m%d")
bucket = 'mybucket'
dir_path = "/log"
s3 = boto3.client('s3')

def log():
    global dir_path
    for (dir_path, dir, files) in os.walk(dir_path):
        for file in files:
            if date in file:
                file_path = os.path.join(dir_path, file)
                print file_path

file_name = (log())
key = (log())
res = s3.upload_file(file_name, bucket, key)
and this is the result:
log1
log2
log3
log4
Traceback (most recent call last):
  File "test2.py", line 21, in <module>
    res = s3.upload_file(file_name, bucket, key)
  File "/home/user/.local/lib/python2.7/site-packages/boto3/s3/transfer.py", line 273, in upload_file
    extra_args=ExtraArgs, callback=Callback)
  File "/home/user/.local/lib/python2.7/site-packages/boto3/s3/transfer.py", line 273, in upload_file
    raise ValueError('Filename must be a string')
ValueError: Filename must be a string
I have 4 log files.
Please help me, how do I fix it?
2 Answers
It was simple. This is my final code, thanks.
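A minimal sketch of what such a fix could look like, assuming the upload_file call is moved inside the loop so each matching file is uploaded as it is found (not necessarily the exact final code; the bucket name and /log path are just the placeholders from the question):

import os
import datetime as dt
import boto3

date = dt.datetime.now().strftime("%Y%m%d")
bucket = 'mybucket'   # placeholder: your bucket name
dir_path = "/log"     # placeholder: your log directory

s3 = boto3.client('s3')

def upload_logs():
    # Walk the log directory and upload every file whose name contains today's date.
    for root, dirs, files in os.walk(dir_path):
        for name in files:
            if date in name:
                file_path = os.path.join(root, name)
                # Use the bare file name as the S3 key.
                s3.upload_file(file_path, bucket, name)
                print(file_path)

upload_logs()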
Since you need to upload more than one file, and you stated that the upload of one log works, you could do the following, which basically goes through the directory list as per your original intention and then, for each file item that satisfies the criteria (date in file), returns the file path to the calling loop.
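A sketch of that approach, assuming log() is rewritten as a generator that yields each matching path back to the calling loop (the setup mirrors the placeholder variables from the question):

import os
import datetime as dt
import boto3

date = dt.datetime.now().strftime("%Y%m%d")
bucket = 'mybucket'   # placeholder bucket name
dir_path = "/log"     # placeholder log directory
s3 = boto3.client('s3')

def log():
    # Yield the full path of every file under dir_path whose name contains today's date.
    for root, dirs, files in os.walk(dir_path):
        for name in files:
            if date in name:
                yield os.path.join(root, name)

for file_path in log():
    # Upload each matching file, using its bare name as the S3 key.
    s3.upload_file(file_path, bucket, os.path.basename(file_path))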
Please note that if you need to keep track of the results, then you could make a change like so:
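One possible version of that change, building on the generator sketch above and assuming that keeping track of the results means recording which uploads succeeded or failed (upload_file returns None on success, so the status list below is an illustrative choice):

results = []
for file_path in log():
    key = os.path.basename(file_path)
    try:
        s3.upload_file(file_path, bucket, key)
        results.append((file_path, 'uploaded'))
    except Exception as err:  # e.g. boto3.exceptions.S3UploadFailedError
        results.append((file_path, 'failed: %s' % err))

# Print a simple summary of what happened to each file.
for file_path, status in results:
    print('%s: %s' % (file_path, status))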