I tried to get all objects in an AWS S3 bucket using @aws-sdk/client-s3.
Instead of downloading 1000 objects, the program exits after 50 object downloads.
In the while() loop, obj_list.Contents.length is equal to 1000, but the process exits after receiving the responses of only 50 GetObjectCommand requests.
import { S3Client, ListObjectsV2Command, GetObjectCommand } from "@aws-sdk/client-s3"

(async () => {
  const client = new S3Client({
    credentials: {
      accessKeyId: 'XXXXXXXXXXXXXXXXXXXXX',
      secretAccessKey: 'XXXXXXXXXXXXXXXXXXXXX'
    },
    region: "us-east-1"
  })
  const input = {
    Bucket: 'Bucket-Name'
  }
  // List the objects in the bucket (up to 1000 keys per call)
  const cmd = new ListObjectsV2Command(input)
  const obj_list = await client.send(cmd)
  let i = 0
  while (i < obj_list.Contents.length) {
    const command = new GetObjectCommand({
      Bucket: 'Bucket-Name',
      Key: obj_list.Contents[i++].Key
    })
    client.send(command)
      .then(
        (data) => {
          console.log(`Content length: ${data.ContentLength}`)
        },
        (error) => {
          const { requestId, cfId, extendedRequestId } = error.$metadata
          console.log(`Error: ${requestId}, ${cfId}, ${extendedRequestId}`)
        }
      )
  }
  console.log("Done")
})();
console.log("End")
Here is the output in the Visual Studio Code console:
C:\Program Files\nodejs\node.exe .\test.js
End
Done
50
Content length: 38535294
What are the possible reasons for this?
UPD. Here is the code which creates an array of Promises batch by batch: it resolves one slice of Promises, then creates the next one.
No difference – after 50 requests the script exits with status 13:
"Process exited with code 13".
The statuses of all resolved Promises are 'fulfilled'.
// obj_list contains all objects from the bucket, as in the code above
// ...
const step = 50
let i = 0
while (i < obj_list.Contents.length) {
  const to = Math.min(i + step, obj_list.Contents.length)
  let promises = []
  for (let f = i; f < to; ++f) {
    promises.push(client.send(
      new GetObjectCommand({
        Bucket: 'Bucket',
        Key: obj_list.Contents[f].Key
      })
    ))
  }
  const statuses = await Promise.allSettled(promises)
  i = to
}
This code exits on await Promise.all(promises) with exit code 13:
const promises = obj_list.Contents.map(async (obj_cont) => {
  const command = new GetObjectCommand({
    Bucket: 'Bucket',
    Key: obj_cont.Key
  })
  const data = await client.send(command)
});
const statuses = await Promise.all(promises)
Terminal output:
C:\Program Files\nodejs\node.exe .\async_batch.js
Process exited with code 13
2 Answers
It seems you are missing await before the send method. Refer to this example – List objects in an Amazon S3 bucket using an AWS SDK.
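For illustration, here is a minimal sketch of the question's while loop with the missing await added (it assumes the client, bucket name, and obj_list from the question; error handling omitted):

let i = 0
while (i < obj_list.Contents.length) {
  const command = new GetObjectCommand({
    Bucket: 'Bucket-Name',
    Key: obj_list.Contents[i++].Key
  })
  // Awaiting the send keeps the async function alive until each response arrives
  const data = await client.send(command)
  console.log(`Content length: ${data.ContentLength}`)
}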
Update 2023-03-21
I created a working example on my GitHub. It may make it easier for you to compare and debug the code.
Parallel or Sequence execution of AWS SDK v3 S3 Client
Example of how to use the AWS SDK v3 S3 client to upload files to S3 in parallel or in sequence.
Requirements
npm install
to install the dependencies.BUCKET
environment variable to a random bucket name.node bucket-create.js
to create the bucket.node bucket-delete.js
to delete the bucket.Parallel
Using
Promise.all
to upload files in parallel.Sequence
Using
while
loop to upload files in parallel.Original answer
This is happening because you are using the callback version of the Promise object within the loop: the loop finishes before the JS event loop fires all the .then callbacks.
As suggested by @Ankush Jain, you can use await to resolve the promise. This will trigger the requests sequentially.
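For example, a minimal sketch of the sequential variant (again assuming the client and obj_list from the question):

for (const obj of obj_list.Contents) {
  // Each iteration waits for the previous download to finish before starting the next
  const data = await client.send(new GetObjectCommand({
    Bucket: 'Bucket-Name',
    Key: obj.Key
  }))
  console.log(`Content length: ${data.ContentLength}`)
}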
If you need to perform the requests in parallel, you can create an array of promises and use Promise.all or Promise.allSettled to await them.
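As a sketch of that parallel variant (assuming the same client and obj_list as above; Promise.allSettled never rejects, so each result can be inspected individually):

const promises = obj_list.Contents.map((obj) =>
  client.send(new GetObjectCommand({ Bucket: 'Bucket-Name', Key: obj.Key }))
)
// All requests are started up front and awaited together
const results = await Promise.allSettled(promises)
for (const r of results) {
  if (r.status === 'fulfilled') {
    console.log(`Content length: ${r.value.ContentLength}`)
  } else {
    console.log(`Request failed: ${r.reason}`)
  }
}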