
Currently, I am using the @google-cloud/storage NPM package to upload a file directly to a Google Cloud Storage bucket. This requires some trickery, as I only have the image's base64-encoded string. I have to:




  • Decode the string

  • Save it as a file

  • Send the file path to the below script to upload to Google Cloud Storage

  • Delete the local file



I'd like to avoid storing the file in the filesystem altogether, since I am using Google App Engine and I don't want to overload the filesystem or leave junk files behind if the delete operation fails for whatever reason. This is what my upload script looks like right now:



// Convert the base64 string back to an image to upload into the Google Cloud Storage bucket
var base64Img = require('base64-img');
var filePath = base64Img.imgSync(req.body.base64Image, 'user-uploads', 'image-name');

// Instantiate the GCP Storage instance
var gcs = require('@google-cloud/storage')(),
    bucket = gcs.bucket('google-cloud-storage-bucket-name');

// Upload the image to the bucket
bucket.upload(__dirname.slice(0, -15) + filePath, {
  destination: 'profile-images/576dba00c1346abe12fb502a-original.jpg',
  public: true,
  validation: 'md5'
}, function(error, file) {
  if (error) {
    sails.log.error(error);
  }
  return res.ok('Image uploaded');
});


Is there any way to upload the base64-encoded string of the image directly, instead of having to convert it to a file and then upload using the path?



Answers

The solution, I believe, is to use the file.createWriteStream functionality that the bucket.upload function wraps in the Google Cloud Node SDK.



I've got very little experience with streams, so try to bear with me if this doesn't work right off.



First of all, we need to take the base64 data and drop it into a stream. For that, we're going to include the stream library, create a buffer from the base64 data, and add the buffer to the end of the stream.



// Create a PassThrough stream and end it with the decoded image buffer
var stream = require('stream');
var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from(req.body.base64Image, 'base64'));


More on decoding base64 and creating the stream.
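One thing to watch for (an assumption on my part, since I don't know exactly what req.body.base64Image contains): if the client sends a full data URI such as data:image/jpeg;base64,..., you'll want to strip that prefix before decoding, or the resulting buffer won't be a valid image. A quick sketch, where base64ToBuffer is just a hypothetical helper:

// Hypothetical helper: strip an optional data URI prefix before decoding.
// Assumes the input is either raw base64 or a 'data:<mime>;base64,<data>' string.
function base64ToBuffer(input) {
  var matches = input.match(/^data:([A-Za-z0-9+\/.-]+);base64,(.*)$/);
  var data = matches ? matches[2] : input;
  return Buffer.from(data, 'base64');
}

bufferStream.end(base64ToBuffer(req.body.base64Image));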



We're then going to pipe the stream into a write stream created by the file.createWriteStream function.



var gcs = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Define the bucket.
var myBucket = gcs.bucket('my-bucket');
// Define the file and file name.
var file = myBucket.file('my-file.jpg');

// Pipe the 'bufferStream' into a 'file.createWriteStream' method.
bufferStream.pipe(file.createWriteStream({
  metadata: {
    contentType: 'image/jpeg',
    metadata: {
      custom: 'metadata'
    }
  },
  public: true,
  validation: 'md5'
}))
.on('error', function(err) {
  // The upload failed; handle or log the error here.
})
.on('finish', function() {
  // The file upload is complete.
});


Info on file.createWriteStream, File docs, bucket.upload, and the bucket.upload method code in the Node SDK.



So the way the above code works: we define the bucket we want to put the file in, then define the file and the file name. We don't set upload options here. We then pipe the bufferStream variable we just created into the file.createWriteStream method we discussed before, and in its options we define the metadata and any other options we want to apply. It was very helpful to look directly at the Node SDK code on GitHub to figure out how they break down the bucket.upload function, and I recommend you do so as well. Finally, we attach a couple of events for when the upload finishes and when it errors out.
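If it helps, here's a rough sketch of how you might wrap the whole thing in a reusable, promise-returning helper. The function name uploadBase64Image and its parameters are placeholders of mine, not part of the SDK, and the usage example assumes the same Sails response methods (res.ok, res.serverError) as in your question:

var stream = require('stream');
var gcs = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Hypothetical wrapper: streams a base64 string into a GCS file and
// resolves once the upload finishes (or rejects on error).
function uploadBase64Image(bucketName, destination, base64Data) {
  return new Promise(function(resolve, reject) {
    var file = gcs.bucket(bucketName).file(destination);
    var bufferStream = new stream.PassThrough();
    bufferStream.end(Buffer.from(base64Data, 'base64'));

    bufferStream.pipe(file.createWriteStream({
      metadata: { contentType: 'image/jpeg' },
      public: true,
      validation: 'md5'
    }))
    .on('error', reject)
    .on('finish', function() {
      resolve(file);
    });
  });
}

// Usage:
// uploadBase64Image('my-bucket', 'profile-images/576dba00c1346abe12fb502a-original.jpg', req.body.base64Image)
//   .then(function(file) { return res.ok('Image uploaded'); })
//   .catch(function(err) { sails.log.error(err); return res.serverError(err); });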

