I have a Node function where I'm attaching a PDF from an S3 URL, and I keep getting a 403 error. This happens in the program as well as the browser. Here is the Nodemailer function I'm trying to attach the URL with:
transporter.sendMail({
  from: '[email protected]',
  to: `${email}`, // list of receivers
  subject: 'test email', // Subject line
  attachments: [{
    filename: `${creation}.pdf`,
    href: 'https://test-bucket.s3.amazonaws.com/testfolder/test.pdf',
    contentType: 'application/pdf'
  }],
  html: `
    <p>Greetings test</p>
  ` // html body
}, function (err, info) {
  if (err) {
    console.error(err)
  } else {
    console.log(info)
  }
})
Here is my bucket permissions policy:
{
"Version": "2012-10-17",
"Id": "Policy1720627335118",
"Statement": [
{
"Sid": "Stmt1720627331285",
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::946520502272:root"
},
"Action": "s3:*",
"Resource": "arn:aws:s3:::test-bucket"
}
]
}
I made that policy to be as open as possible, and I have the "Block all public access" option turned off. I'm currently just testing locally. What can I do so that when I copy and paste that URL I can access it?
Here is the error I am getting in Node:
Error: Invalid status code 403
2 Answers
When you send an HTTP GET request from your browser for `https://test-bucket.s3.amazonaws.com/testfolder/test.pdf`, that is an unauthenticated request. There are no AWS credentials involved at all. The only way this can work is for the object to be public. Your S3 bucket policy, however, does not make that object, or any object, public: it grants access to your account's root principal, and its `Resource` is the bucket ARN rather than an object ARN. You would need to allow `s3:GetObject` on the object ARN to unauthenticated users, for example using:
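A minimal policy statement along these lines (a sketch, assuming your bucket really is named `test-bucket` as in the question; note the `Resource` must target the objects with `/*`, not the bucket itself):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::test-bucket/*"
    }
  ]
}
```

With `Principal` set to `"*"` and the `/*` suffix on the resource, anonymous GET requests for any object in the bucket will succeed (provided "Block all public access" remains off).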
Note that you can typically avoid making objects public by simply creating and sharing an S3 pre-signed URL for the object. That URL will be usable anywhere an HTTP GET is usable, until the pre-signed URL expires.
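A sketch of generating such a pre-signed URL with the AWS SDK for JavaScript v3 (the bucket name, key, region, and one-hour expiry are assumptions for illustration):

```javascript
// Requires: npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const client = new S3Client({ region: "us-east-1" });

// Describe the GET we want to pre-sign
const command = new GetObjectCommand({
  Bucket: "test-bucket",
  Key: "testfolder/test.pdf",
});

// Produce a URL that is valid for one hour (expiresIn is in seconds)
getSignedUrl(client, command, { expiresIn: 3600 }).then((url) => {
  // This URL can be pasted into a browser or used as the Nodemailer href
  console.log(url);
});
```

The resulting URL embeds the signature as query parameters, so the bucket can stay fully private; anyone holding the URL can fetch the object until it expires.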
If you log into the AWS console and go to the object in S3, you can click a drop-down and choose "Make public".
If you need to automate it after that, you can inspect the resulting policy/ACLs to see how the console did it and mimic that, so your automation follows AWS's current best practices (probably easier to copy than to read the docs in this case).
Also, I believe CloudFront is a better way to achieve this with a private bucket.