I am streaming data to the browser like this:
res.writeHead(200, {
  'Content-Type': 'text/event-stream',
  'Content-Disposition': 'attachment; filename="data.dat"',
  'Cache-Control': 'no-cache',
  'Connection': 'keep-alive'
});
stream1.pipe(res, {end: false});
stream1.on('end', () => {
  console.log("stream 1 finished");
  stream2.pipe(res, {end: false});
  stream2.on('end', () => {
    console.log("last stream finished");
    res.end();
  });
});
On the Firebase Functions Emulator this works fine: the download starts immediately, and curl -v shows the response headers right away and begins downloading.
But when I deploy the function to production, the same code behaves differently. The download does not start immediately, and curl -v doesn't even show the response headers. It seems the download only starts for the client once the server has finished writing all of the streams. And when the streams are large, the client gets Error: could not handle the request, while the logs show no errors and suggest the cloud function finished writing all streams.
Perhaps it is a buffering configuration problem like the one described here? -> https://stackoverflow.com/a/66656773/176336
2 Answers
Currently (Aug 2022) streaming is not possible from Firebase Functions because of the buffering implementation.
My workaround was to write the file to Firebase Storage and redirect the browser to download from there.
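That workaround can be sketched roughly as follows, assuming the Firebase Admin SDK's Storage API (`bucket.file().save()` and `file.getSignedUrl()`, both of which exist in `firebase-admin`). The bucket is passed in (it would come from `require('firebase-admin').storage().bucket()` in a deployed function), and the function and file names are illustrative:

```javascript
// Sketch of the workaround: write the output to Cloud Storage, then redirect
// the client to a short-lived signed URL instead of streaming the bytes
// through the (buffered) function response.
async function downloadViaStorage(bucket, res, data, name) {
  const file = bucket.file(name);
  await file.save(data, { contentType: 'application/octet-stream' });
  const [url] = await file.getSignedUrl({
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // signed URL valid for 15 minutes
  });
  res.redirect(url);
}
```

The download then happens directly from Storage, which streams properly, at the cost of first materializing the whole file.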
You can refer to the thread:
As mentioned in the thread:
For the request, you can refer to the documentation, which explains that HTTP functions accept the HTTP request methods listed on the page HTTP triggers. Your HTTP handler can inspect the request method and perform different actions based on the method.
For the workaround, you can refer to the thread, where it is mentioned: