
How do I configure AWS S3 bucket policies so that uploaded files are publicly readable?
Could somebody please give an example in Node.js code?

I am new to this field and have no prior experience with it. Any help is highly appreciated.

3 Answers


  1. You can check out the official AWS documentation on using Amazon S3 with the SDK for JavaScript (Node.js), which explains how to update bucket policies.
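    As a rough illustration of what that documentation covers, here is a sketch using the AWS SDK for JavaScript (v2). The bucket name `my-bucket` and the helper `buildPublicReadPolicy` are illustrative choices, not names from the AWS docs; the actual `putBucketPolicy` call needs the `aws-sdk` package and valid credentials, so it is shown commented out:

    ```javascript
    // Build a bucket policy document that allows public reads of all objects.
    // The bucket name is a placeholder -- substitute your own.
    function buildPublicReadPolicy(bucketName) {
      return JSON.stringify({
        Version: '2012-10-17',
        Statement: [
          {
            Sid: 'AllowPublicRead',
            Effect: 'Allow',
            Principal: '*',
            Action: 's3:GetObject',
            Resource: `arn:aws:s3:::${bucketName}/*`,
          },
        ],
      });
    }

    // Applying it (requires the aws-sdk package and AWS credentials):
    // const AWS = require('aws-sdk');
    // const s3 = new AWS.S3({ region: 'us-east-1' });
    // s3.putBucketPolicy(
    //   { Bucket: 'my-bucket', Policy: buildPublicReadPolicy('my-bucket') },
    //   (err) => { if (err) throw err; }
    // );
    ```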

  2. There are several steps in this process. These instructions apply to Node.js 14 (runtime: nodejs14.x).
    First, after logging in to your AWS account, open your bucket's Permissions tab using a link in the below format.
    https://s3.console.aws.amazon.com/s3/buckets/{BUCKET-NAME}?region={REGION}&tab=permissions#

    This is an example link

    https://s3.console.aws.amazon.com/s3/buckets/logo?region=us-east-1&tab=permissions#

    (Image: an example of entering the path)

    The second step is creating the bucket policy. Replace "BUCKET-NAME" with the name of your bucket. (Note that this policy allows uploads only when they carry the public-read ACL; the uploaded objects then become publicly readable through that ACL.)

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowPublicRead",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::BUCKET-NAME/*",
                "Condition": {
                    "StringEquals": {
                        "s3:x-amz-acl": "public-read"
                    }
                }
            }
        ]
    }

    (Image: bucket policy JSON configuration)

    Then create an endpoint in serverless.yml as shown below. The file_upload function is defined in the handler.js file.

    serverless.yml

    file_upload:
        handler: handler.file_upload
        timeout: 15
        events:
          - httpApi:
              path: /file-upload
              method: post
    

    After that, write the handler.js file as below.

    handler.js

    'use strict';
    const AWS = require("aws-sdk");
    const s3 = new AWS.S3();
    const { Validator } = require('node-input-validator');
    const md5 = require('md5'); // Used below to hash file names; install with `npm install md5`.
    
    const MAX_SIZE = 2097152 // 2MB
    const bucket = 'S3_BUCKET-NAME' // Name of your bucket.
    const Busboy = require("busboy")
    
    s3.config.update({
      region: "us-east-1",
      accessKeyId: 'S3_ACCESS_KEY_ID',
      secretAccessKey: 'S3_SECRET_ACCESS_KEY'
    });
    
    const sendJSON = (code, message, data) => {
        let resData = {
            "status": code < 400 ? 'Success' : 'Error',
            "message": message,
        }
        if (data) resData["data"] = data;
        return {
            statusCode: code,
            headers: {
                "Content-Type": "application/json"
            },
            body: JSON.stringify({
                ...resData
            })
        };
    }
    
    const FORM = {
        parse(body, headers) {
            return new Promise((resolve, reject) => {
                const data = {};
                const buffer = Buffer.from(body, 'base64');
                const bb = Busboy({
                    headers: Object.keys(headers).reduce((newHeaders, key) => {
                        // busboy expects lower-case headers.
                        newHeaders[key.toLowerCase()] = headers[key];
                        return newHeaders;
                    }, {}),
                    limits: {
                        fileSize: MAX_SIZE, // Set as desired.
                        files: 1,
                    },
                });
    
                bb.on('file', (name, stream, info) => {
                    const chunks = [];
    
                    stream.on('data', (chunk) => {
                        if (name === 'File') {
                            chunks.push(chunk);
                        } else {
                            reject(new Error('File not found.'));
                        }
                    }).on('limit', () => {
                        reject(new Error('File size limit has been reached.'));
                    }).on('close', () => {
                        if (name === 'File') {
                            data[name] = Buffer.concat(chunks);
                            data['ContentType'] = info.mimeType;
                            data['FileName'] = info.filename;
                        }
                    });
                });
                bb.on('field', (name, val, info) => {
                    data[name] = val;
                });
                bb.on('error', (err) => {
                    reject(err);
                });
                bb.on('close', () => {
                    resolve(data);
                });
    
                bb.end(buffer);
            });
        }
    };
    const uploadToS3 = (bucket, key, buffer, mimeType) =>
        new Promise((resolve, reject) => {
            s3.upload(
                { Bucket: bucket, Key: key, Body: buffer, ContentType: mimeType, ACL: 'public-read' },
                function (err, data) {
                    if (err) return reject(err);
                    resolve(data);
                })
        });
    
    module.exports.file_upload = async (event) => {
      try {
            const data = await FORM.parse(event['body'], event['headers']);
            const validations = new Validator(data, {
                File: 'required'
            });
            const path = data.path ? data.path : null; // Optional "path" form field: a sub-folder prefix in S3.
            const matched = await validations.check();
            if (!matched) {
                return sendJSON(400, validations.errors);
            }
            const list = data.FileName.split(".");
            const prefix = path ? `${path}/` : '';
            const originalKey = `${prefix}${Date.now()}_${md5(list[0])}.${list[list.length - 1]}`;
            const originalFile = await Promise.all([
                uploadToS3(bucket, originalKey, data.File, data.ContentType)
            ]);
            return sendJSON(201, 'Successfully saved.', originalFile);
      } catch (e) {
        return sendJSON(400, e.message);
      }
    };
    

    A link to the relevant AWS documentation is attached below.

    https://docs.aws.amazon.com/AmazonS3/latest/userguide/example-bucket-policies.html
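
    Once an object is uploaded with the public-read ACL as above, it is reachable at a predictable URL (the SDK's upload callback also reports it as `data.Location`). The helper below is an illustrative assumption, not part of the handler above:

    ```javascript
    // Illustrative helper (not part of the answer's handler):
    // the public URL of an object uploaded with ACL "public-read".
    function publicObjectUrl(bucket, region, key) {
      // Encode each path segment, but keep "/" separators intact.
      const encodedKey = key.split('/').map(encodeURIComponent).join('/');
      return `https://${bucket}.s3.${region}.amazonaws.com/${encodedKey}`;
    }

    // Example:
    // publicObjectUrl('logo', 'us-east-1', 'uploads/1650000000_abc.png')
    ```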

  3. The easiest way to make every object in a bucket publicly accessible is to add this Bucket Policy to the bucket:

    {
        "Version": "2008-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::BUCKET-NAME/*"
            }
        ]
    }
    

    This will make every object in the bucket publicly accessible to anyone who knows the object's filename (Key).

    To add this Bucket Policy, you will need to turn off S3 Block Public Access (the two options that mention ‘Bucket Policy’).
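
    The same change can be made programmatically; here is a sketch using the AWS SDK for JavaScript (v2). The helper name and the choice to leave the ACL-related blocks enabled are assumptions, not part of the answer; the `putPublicAccessBlock` call itself needs the `aws-sdk` package and credentials, so it is shown commented out:

    ```javascript
    // Settings that allow a public bucket policy while keeping ACL blocking on.
    // (The two "Policy" options are the ones the answer refers to.)
    function publicAccessBlockForPolicies() {
      return {
        BlockPublicPolicy: false,      // allow adding a public bucket policy
        RestrictPublicBuckets: false,  // allow the policy to take effect
        BlockPublicAcls: true,         // ACL-related blocks can stay enabled
        IgnorePublicAcls: true,
      };
    }

    // Applying it (requires the aws-sdk package and AWS credentials):
    // const AWS = require('aws-sdk');
    // new AWS.S3().putPublicAccessBlock({
    //   Bucket: 'BUCKET-NAME',
    //   PublicAccessBlockConfiguration: publicAccessBlockForPolicies(),
    // }, (err) => { if (err) throw err; });
    ```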
