
I have multiple files uploaded in an S3 bucket and I want to print the names of the files as a list in the output of a Python Lambda function.

Let's say I have files S1, S2, S3 … and so on, and I want the output as
['S1', 'S2', 'S3', ...] for all the files uploaded in S3.

I have tried the code below, but I am not able to print the output as expected:

import boto3
import json
from datetime import datetime as dt

client = boto3.resource('s3')
s3_client = boto3.client('s3')
def lambda_handler(event, context):
    paginator = s3_client.get_paginator('list_objects')

    # Create a PageIterator from the Paginator
    page_iterator = paginator.paginate(Bucket='mylocals3bucketpoc')
    for page in page_iterator:
        vars = []
        for Contents in page['Contents']:
            print(Contents['Key'])

The output:

S1
S2
S3

Since I want the output to look like ['S1', 'S2', 'S3'], I am trying to use list append, but I am not able to get the desired output.
Please help me resolve this.

2 Answers


  1. You can create a list and append your names to it.

    import boto3

    s3_client = boto3.client('s3')

    def lambda_handler(event, context):
        paginator = s3_client.get_paginator('list_objects')
        page_iterator = paginator.paginate(Bucket='mylocals3bucketpoc')
        file_list = []

        # Collect every object key across all pages
        for page in page_iterator:
            for content in page['Contents']:
                file_list.append(content['Key'])
        print(file_list)
    

    Or use a simple list comprehension:

    file_list = [content['Key'] for page in page_iterator for content in page['Contents']]
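
    If the Lambda invocation itself should return this list rather than only printing it to the CloudWatch logs, the handler can also return it. A minimal sketch building on the same paginator (returning a JSON body here is an assumption about how the function will be invoked, not something stated in the question):

    import json
    import boto3

    s3_client = boto3.client('s3')

    def lambda_handler(event, context):
        paginator = s3_client.get_paginator('list_objects')
        page_iterator = paginator.paginate(Bucket='mylocals3bucketpoc')

        # Gather every key; .get() avoids a KeyError on an empty bucket
        file_list = [content['Key']
                     for page in page_iterator
                     for content in page.get('Contents', [])]

        print(file_list)  # e.g. ['S1', 'S2', 'S3']
        # Return the list so the caller receives it, not just a log line
        return {
            'statusCode': 200,
            'body': json.dumps(file_list)
        }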
    
  2. To achieve this in an AWS Lambda function, you can use the following code. This code lists all objects in the specified S3 bucket, extracts the filenames, and prints them in a list format.

    Make sure you have the appropriate IAM role permissions for your Lambda function to access the S3 bucket.
    Use the boto3 library to interact with S3.
    Here’s the code for your Lambda function:

    import boto3

    def lambda_handler(event, context):
        # Initialize the S3 client
        s3_client = boto3.client('s3')

        # Specify your bucket name
        bucket_name = 'your-bucket-name'

        # Initialize an empty list to hold the filenames
        filenames = []

        # List objects in the specified S3 bucket
        response = s3_client.list_objects_v2(Bucket=bucket_name)

        # Check if the bucket has any objects
        if 'Contents' in response:
            for obj in response['Contents']:
                # Append each filename (key) to the list
                filenames.append(obj['Key'])

        # Print the filenames in the Lambda logs
        print(filenames)

        # Return the filenames list as the function's response
        return filenames
    

    Explanation:
    - Initialize the S3 client: the Lambda function creates an S3 client to communicate with S3.
    - List objects: the list_objects_v2 method is called to retrieve the list of objects in the specified bucket.
    - Extract and append filenames: for each object, the filename (i.e., the object Key) is appended to the filenames list.
    - Print and return: the filenames list is printed (for viewing in the Lambda logs) and returned as the function's output.

    Important points:
    - IAM permissions: ensure the Lambda function's execution role has s3:ListBucket permission for the bucket.
    - Environment setup: replace 'your-bucket-name' with the actual name of your S3 bucket.

    With this setup, when you upload files like S1, S2, S3 to the bucket, invoking this Lambda function will output:

    ['S1', 'S2', 'S3', ...]
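
    Note that a single list_objects_v2 call returns at most 1,000 keys. If the bucket may hold more objects than that, a paginated variant of the same handler could be used; a minimal sketch, still using the placeholder bucket name from above:

    import boto3

    def lambda_handler(event, context):
        s3_client = boto3.client('s3')
        bucket_name = 'your-bucket-name'  # placeholder, as above

        filenames = []
        # The paginator issues repeated list_objects_v2 calls until all keys are returned
        paginator = s3_client.get_paginator('list_objects_v2')
        for page in paginator.paginate(Bucket=bucket_name):
            for obj in page.get('Contents', []):
                filenames.append(obj['Key'])

        print(filenames)
        return filenames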

