I’d like to set up a React hook (or functional component) that reads data from a gzip-compressed file, uses the pako library to decompress it, and then renders it in the browser.
Edit: The compressed binary file is local to the project. It is not hosted on a web server elsewhere.
A simple test I had in mind looked similar to this:
import { memo } from 'react';
import compressedFile from "./data.json.gz";
import './App.css';

const pako = require('pako');

const App = memo(() => {
  try {
    const compressedBytes = new Uint8Array(compressedFile);
    const uncompressedStr = pako.inflate(compressedBytes, { to: 'string' });
    return (
      <div>
        Data: <pre>{uncompressedStr}</pre>
      </div>
    );
  } catch (err) {
    return (
      <div>
        Error: <pre>{String(err)}</pre>
      </div>
    );
  }
});

export default App;
However, this did not work due to the following error:
Uncaught SyntaxError: Invalid or unexpected token (at data.json.gz?import:1:1)
The file data.json.gz is in the same directory as App and is just a gzip-compressed test file, e.g.:
$ echo -e '{"a":123,"b":456}' | gzip -c > data.json.gz
Can import be used to read in binary data? Is there another way to read a binary file, directly or indirectly, into a byte array I can process with pako?
Please note that I am looking for an answer specifically about working directly with a binary file, and not a base-64 or other re-encoded file that is a string.
2 Answers
As far as I know, the module system expects a file structure that can be parsed into a format that JavaScript can understand and execute. Binary files do not fall into that category by default.
I can think of two ways to deal with this issue:
Configuring Webpack's file-loader, so the import resolves to a file URL rather than being parsed as JavaScript.
Or, more simply, using fetch:
You can use untar-js and pako to extract and read a tarball in the browser. Here is an example with the Maven binary:
Note: see the extractFiles function near the bottom.