I’m trying to return images to my users to view in the DOM without creating public links to the images (the route is protected by my server’s auth before any image is returned). This is what I have tried so far, but it might be the wrong way to go about it.
I’m following the Google Cloud Storage with Node.js guide to add image uploading to my project. I now have the upload working and images are being saved in Google Cloud Storage. The guide shows how to retrieve a public URL:
https://cloud.google.com/nodejs/getting-started/using-cloud-storage
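For reference, the public-URL approach in that guide boils down to building a link of this form once the object is uploaded (a rough sketch from memory, not the guide's exact code):

// Anyone with this URL can fetch the object, but only if it has been made public.
const publicUrl = `https://storage.googleapis.com/${bucket.name}/${encodeURIComponent(blob.name)}`;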
I’m trying to use file streaming instead of the public URL so that my images aren’t made public. I’m guessing it would look similar to the streams in this repository:
https://github.com/GoogleCloudPlatform/google-cloud-node
But I’m trying to return the images in an HTTP response rather than simply storing them locally on my file system.
My client-side JavaScript is getting a response that looks like a giant blob of binary data: ����JFIF���Photoshop 3.08BIM�
The metadata makes me think that this is, in fact, the correct image, but I’m not sure how to get it back to looking like an image.
Here is my Node server file. The upload to Google Cloud Storage (the POST route) is working; I can see the images being added. The GET route is returning what looks like a giant blob that seems to be the image I want, just in a form I don’t know how to convert.
My code looks like this:
var express = require('express');
var router = express.Router();

const Multer = require('multer');
const multer = Multer({
  storage: Multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024 // no larger than 5mb, you can change as needed.
  }
});

const Storage = require('@google-cloud/storage');
const storage = Storage({
  keyFilename: 'server/firebase-service-account.json',
  projectId: process.env.FIREBASE_PROJECT_ID
});
const bucket = storage.bucket(process.env.FIREBASE_STORAGE_BUCKET);

// Process the file upload and upload to Google Cloud Storage.
router.post('/', multer.single('file'), (req, res, next) => {
  console.log('req.body', req.body);
  console.log('req.file', req.file);

  if (!req.file) {
    res.status(400).send('No file uploaded.');
    return;
  }

  // Create a new blob in the bucket and upload the file data.
  const blob = bucket.file(req.file.originalname);
  const blobStream = blob.createWriteStream();

  blobStream.on('error', (err) => {
    next(err);
    return;
  });

  blobStream.on('finish', () => {
    res.status(200).send('The image has been successfully uploaded to google cloud storage');
  });

  blobStream.end(req.file.buffer);
});

router.get('/', function (req, res, next) {
  var stream = bucket.file('Toast.jpg').createReadStream();

  stream.pipe(res);

  stream.on('data', function (data) {
    res.write(data);
  });

  stream.on('error', function (err) {
    console.log('error reading stream', err);
  });

  stream.on('end', function () {
    res.set({
      "Content-Disposition": 'attachment; filename="Toast.jpg"',
      "Content-Type": 'image/jpg'
    });
    res.end();
  });
});

module.exports = router;
Is this close? Is there a better way to secure my images with my auth?
2 Answers
I was thinking about it the wrong way. I needed to just point my <img src=""> at the correct location: the HTML references the protected route directly, and the server (Node) streams the image back.
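A minimal sketch of that idea, with the route path and filename handling invented purely for illustration (the auth middleware is assumed to run before this router):

// HTML: point the image tag straight at the protected route.
// <img src="/api/images/Toast.jpg">

// Server (Node/Express): stream the object from the bucket into the response.
router.get('/:filename', (req, res) => {
  const file = bucket.file(req.params.filename); // e.g. 'Toast.jpg'
  res.type('image/jpeg');
  file.createReadStream()
    .on('error', (err) => {
      console.log('error reading stream', err);
      if (!res.headersSent) {
        res.sendStatus(404);
      } else {
        res.end();
      }
    })
    .pipe(res);
});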
I would still love feedback on whether this is the correct way to solve this problem. (Obviously, the hardcoding will need to go.)
The "get image" part can now be done with the download() function; here is my snippet in TypeScript:
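A sketch along those lines, using File.download() from @google-cloud/storage, which resolves with the file contents as a Buffer (the route path, parameter name, and content type are illustrative assumptions):

import * as express from 'express';
import { Storage } from '@google-cloud/storage';

const router = express.Router();
const storage = new Storage({ keyFilename: 'server/firebase-service-account.json' });
const bucket = storage.bucket(process.env.FIREBASE_STORAGE_BUCKET as string);

// GET /:filename: download the object into memory and send it as the response body.
router.get('/:filename', async (req: express.Request, res: express.Response) => {
  try {
    // download() buffers the whole object and resolves with [Buffer].
    const [contents] = await bucket.file(req.params.filename).download();
    res.type('image/jpeg');
    res.send(contents);
  } catch (err) {
    console.error('error downloading file', err);
    res.status(404).send('Image not found.');
  }
});

export default router;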