I’m attempting to upload files through a web form to a Google Drive backend using the Resumable Uploads feature. I’m starting with a script from https://github.com/tanaikech/AsynchronousResumableUploadForGoogleDrive/
I’ve spent much time on this without success. What I’d like to do is upload 1,000 files. The way the async script works now, all 1,000 upload concurrently. What I’d like instead is to upload (for example) 10 files concurrently, wait until they’re done, then repeat with the next 10, and so on until finished. I believe I’ll need to modify https://cdn.jsdelivr.net/gh/tanaikech/ResumableUploadForGoogleDrive_js@master/resumableupload_js.min.js (included in index.html). I’ve already gotten the files split into chunks of 10, but when looping over each chunk with chunk.forEach(...), the async nature of resumableupload_js.min.js still uploads them all in parallel.
What should I focus on changing so that only 10 async uploads run at a time, waiting until each group is done before repeating?
To reproduce, use the setup instructions in the README.md from the repo linked above, except use the code below for index.html.
Note that <insert folder ID> in the index.html file below must be replaced with your own folder ID for this to work.
<input type="file" id="file" multiple="true" />
<input type="button" onclick="run()" value="Upload" />
<div id="progress"></div>
<script src="https://cdn.jsdelivr.net/gh/tanaikech/ResumableUploadForGoogleDrive_js@master/resumableupload_js.min.js"></script>
<script>
function run() {
  google.script.run.withSuccessHandler(accessToken => ResumableUploadForGoogleDrive(accessToken)).getAuth();
}

function ResumableUploadForGoogleDrive(accessToken) {
  const f = document.getElementById("file");
  const chunkSize = 10;
  const totalChunks = Math.ceil(f.files.length / chunkSize);
  console.log("The totalChunks are: " + totalChunks);
  for (let i = 0; i < totalChunks; i++) {
    const startIndex = i * chunkSize;
    console.log("The startIndex is: " + startIndex);
    const endIndex = startIndex + chunkSize;
    console.log("The endIndex is: " + endIndex);
    const chunk = Array.from(f.files).slice(startIndex, endIndex);
    console.log("The chunk is: " + chunk);
    chunk.forEach((file, i) => {
      if (!file) return;
      let fr = new FileReader();
      fr.fileName = file.name;
      fr.fileSize = file.size;
      fr.fileType = file.type;
      fr.readAsArrayBuffer(file);
      fr.onload = e => {
        var id = "p" + ++i;
        var div = document.createElement("div");
        div.id = id;
        document.getElementById("progress").appendChild(div);
        document.getElementById(id).innerHTML = "Initializing.";
        const f = e.target;
        const resource = { fileName: fr.fileName, fileSize: fr.fileSize, fileType: fr.fileType, fileBuffer: fr.result, accessToken: accessToken, folderId: "<insert folder ID>" };
        const ru = new ResumableUploadToGoogleDrive();
        ru.Do(resource, function (res, err) {
          if (err) {
            console.log(err);
            return;
          }
          console.log(res);
          let msg = "";
          if (res.status == "Uploading") {
            msg = Math.round((res.progressNumber.current / res.progressNumber.end) * 100) + "% (" + fr.fileName + ")";
          } else {
            msg = res.status + " (" + fr.fileName + ")";
          }
          document.getElementById(id).innerText = msg;
        });
      };
    });
  }
}
</script>
2 Answers
In your situation, how about the following sample script?
Before you use this script, please enable Drive API at Advanced Google services.
Google Apps Script: code.gs
HTML & Javascript: index.html
Please modify folderId: "root", to your own folder ID.
Testing:
When this script is run, the following result is obtained. In this case, const n = 2; is used, so the files are uploaded 2 at a time with the asynchronous process.
References:
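The answer’s code.gs and index.html are not reproduced above, so as a rough sketch of the batching idea it describes (start n uploads, await them all, then start the next n), here is one way it might look. The helper name uploadInBatches and the uploadOne parameter are assumptions for illustration; uploadOne stands in for a promisified wrapper around ResumableUploadToGoogleDrive’s callback-style Do().

```javascript
// Sketch only — not the answer's actual code. `uploadOne(file)` is assumed
// to return a Promise, e.g. by wrapping the library's callback API:
//   const uploadOne = resource => new Promise((resolve, reject) => {
//     new ResumableUploadToGoogleDrive().Do(resource, (res, err) =>
//       err ? reject(err) : res.status == "Done" ? resolve(res) : null);
//   });
async function uploadInBatches(files, n, uploadOne) {
  const results = [];
  for (let i = 0; i < files.length; i += n) {
    const batch = files.slice(i, i + n);
    // Start n uploads at once, then wait for the whole batch to finish
    // before the loop moves on to the next n files.
    results.push(...await Promise.all(batch.map(uploadOne)));
  }
  return results;
}
```

With n = 2, as in the answer’s test, each pair of files completes before the next pair begins.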
ALTERNATIVE SOLUTION
This is another method by which you may try to upload 1,000 files, but limit the script to 10 files concurrently at a time. Here’s an example:
This is the modified version of index.html:
REFERENCE
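The modified index.html itself is not shown above. A hedged sketch of a rolling limiter, one way to keep 10 uploads in flight at all times (unlike the batch approach, a new upload starts as soon as any slot frees up rather than waiting for a full group), might look like the following. The names uploadWithLimit and uploadOne are illustrative, with uploadOne again standing in for a promisified ResumableUploadToGoogleDrive call.

```javascript
// Sketch only — a worker-pool concurrency limit, not the answer's code.
async function uploadWithLimit(files, n, uploadOne) {
  const results = new Array(files.length);
  let next = 0;
  async function worker() {
    // Each worker pulls the next unclaimed file until none remain.
    while (next < files.length) {
      const i = next++; // safe: JS runs this synchronously, no race
      results[i] = await uploadOne(files[i]);
    }
  }
  // Spin up at most n workers; together they never exceed n in-flight uploads.
  await Promise.all(Array.from({ length: Math.min(n, files.length) }, worker));
  return results;
}
```

Results are stored by index, so they come back in the original file order even when uploads finish out of order.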