
Download and process a large JSON file in a Firebase Cloud Function

I have a large .json.gz file (~2 GB when unzipped) stored in a Firebase Storage bucket, and I would like to download it and process the JSON in a Firebase Cloud Function using Node. Because of the file size I am trying to use the stream-json package to stream the file contents. When I deploy the code below and upload the file, the only output I see in the function log is my "processing file" message. I can't tell whether the download is failing, or whether it succeeds and the parsing fails afterwards. When I tried a smaller file of a different type I did see the "file downloaded" message in the logs, so the download seems to work at least for smaller files. Any advice would be greatly appreciated, TIA.

  exports.importJSON = onObjectFinalized({ memory: "8GiB", cpu: 2, timeoutSeconds: 600 }, async (event) => {
      logger.info("processing file", { structuredData: true });
      const fileBucket = event.data.bucket;
      const filePath = event.data.name;
      var tempFilePath = tmp.tmpNameSync();
      const bucket = getStorage().bucket(fileBucket);
      bucket.file(filePath).download({ destination: tempFilePath })
          .then(() => {
              logger.log('file downloaded');
              let stream = fs.createReadStream(tempFilePath);
              stream
                  .pipe(zlib.createGunzip())
                  .pipe(parser())
                  .pipe(streamValues())
                  .on('data', d => processData(d.value));
          });
  });
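
For reference, below is a minimal sketch of the same function with the asynchronous work awaited. It assumes the silent exit happens because the promise chain above is never returned or awaited: Cloud Functions can terminate the instance (and drop any pending logs) as soon as the handler's promise resolves, which here can happen before the download finishes. processData is the helper from the question; the imports are the standard firebase-functions v2, firebase-admin, tmp, and stream-json modules.

  const { onObjectFinalized } = require("firebase-functions/v2/storage");
  const { initializeApp } = require("firebase-admin/app");
  const { getStorage } = require("firebase-admin/storage");
  const logger = require("firebase-functions/logger");
  const { pipeline } = require("node:stream/promises");
  const fs = require("node:fs");
  const zlib = require("node:zlib");
  const tmp = require("tmp");
  const { parser } = require("stream-json");
  const { streamValues } = require("stream-json/streamers/StreamValues");

  initializeApp();

  exports.importJSON = onObjectFinalized(
      { memory: "8GiB", cpu: 2, timeoutSeconds: 600 },
      async (event) => {
          logger.info("processing file", { structuredData: true });
          const tempFilePath = tmp.tmpNameSync();
          const bucket = getStorage().bucket(event.data.bucket);

          // Awaiting the download means a failure here is thrown and logged
          // instead of disappearing into an orphaned promise.
          await bucket.file(event.data.name).download({ destination: tempFilePath });
          logger.log("file downloaded");

          // pipeline() resolves when the last stage finishes and rejects on an
          // error in any stage, so the handler stays alive until parsing is done.
          await pipeline(
              fs.createReadStream(tempFilePath),
              zlib.createGunzip(),
              parser(),
              streamValues(),
              async function* (source) {
                  for await (const { value } of source) {
                      processData(value); // processData is the helper from the question
                  }
              }
          );
          logger.log("parsing complete");
      }
  );

One further caveat: streamValues() assembles each top-level JSON value in full before emitting it, so if the 2 GB file is a single object or array this still buffers the whole thing in memory. For a file whose top level is one big array, streamArray() (from stream-json/streamers/StreamArray) emits the elements one at a time instead.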
