Save your clients' bandwidth: serving big JSON dataset files over the network like a pro


Ever needed to serve big JSON files (100+ MB) over the network?

An efficient way to handle this is to compress the JSON into a binary format and send the compressed file to the client.

First, let's convert the JSON to a .gz file and see the size difference.

Comparison:
Actual file size – 107 MB; compressed .gz file size – 3.2 MB.

Linux command to convert JSON to .gz:

gzip filename.json

This replaces filename.json with a compressed filename.json.gz (pass -k if you want to keep the original file).
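If you'd rather produce the .gz from Node instead of the shell, here is a minimal sketch using Node's built-in zlib module (the file names are just placeholders):

import { createReadStream, createWriteStream } from "fs";
import { createGzip } from "zlib";
import { pipeline } from "stream/promises";

// Stream-compress data.json into data.json.gz without loading
// the whole 100+ MB file into memory.
async function gzipFile(src: string, dest: string) {
  await pipeline(
    createReadStream(src),
    createGzip({ level: 9 }), // maximum compression
    createWriteStream(dest)
  );
}

gzipFile("data.json", "data.json.gz").catch(console.error);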

Now you can easily serve this file over the network, and the client decompresses the .gz file back into JSON.
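If you control the server, serving the pre-compressed file is straightforward. Here is a minimal sketch assuming a Node/Express server (the port and directory layout are assumptions, not part of the original article):

import express from "express";
import path from "path";

const app = express();

// Serve ./data as-is. Since no Content-Encoding header is set,
// the browser treats data.json.gz as opaque binary, and the
// client code below can decompress it with pako.
app.use("/data", express.static(path.join(__dirname, "data")));

app.listen(3000, () => console.log("Listening on http://localhost:3000"));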

A useful library for decompressing binary data back into a string in the browser is [pako](https://www.npmjs.com/package/pako).

Use the code below in React to decompress the .gz file back into JSON.


import { inflate } from "pako";

// Decompress gzipped bytes into the original JSON string.
const parse = (bin: ArrayBuffer) => inflate(new Uint8Array(bin), { to: "string" });

export const fetchJSON = () => {
  return fetch("/data/data.json.gz")
    .then((response) => response.blob())
    .then((blob) => readSingleFile(blob))
    .then((buffer) => {
      const jsonData = JSON.parse(parse(buffer)); // string -> object
      console.log(jsonData);
      return jsonData;
    });
};

function readSingleFile(file: File | Blob): Promise<ArrayBuffer> {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    // Resolve with the raw bytes once the blob has been read.
    reader.onload = (e) => resolve(e.target?.result as ArrayBuffer);
    reader.onabort = reject;
    reader.onerror = reject;
    reader.readAsArrayBuffer(file);
  });
}
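For completeness, here is a hypothetical usage sketch of fetchJSON in a React component (the component name, module path, and rendering are assumptions, not part of the original code):

import { useEffect, useState } from "react";
import { fetchJSON } from "./fetchJSON"; // hypothetical module path

export const DatasetViewer = () => {
  const [data, setData] = useState<unknown>(null);

  useEffect(() => {
    // Fetch and decompress the dataset once on mount.
    fetchJSON()
      .then(setData)
      .catch((err) => console.error("Failed to load dataset", err));
  }, []);

  return <pre>{data ? "Dataset loaded" : "Loading…"}</pre>;
};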

*Why this article even makes sense:*

Recently I was working with deck.gl and had to render big JSON dataset files (150+ MB). If the files are present locally, that's fine, but imagine serving those dataset files from a CDN 😨😨.
I searched the internet for ways to serve large dataset files efficiently in deck.gl and found nothing, so I ended up converting the files to binary and decoding them in the browser to render the map contents.

I know this is not the optimal approach, but if anyone has a better one, or has experience rendering large datasets with deck.gl, please comment below.

Thanks.
