I'm relatively new to full-stack development, and I'm currently trying to figure out an effective way to send and fetch large data between my front-end (React) and back-end (Express) while minimizing memory usage. Specifically, I'm building a mapping app that requires me to work with large JSON files (10-100 MB).
My current setup works for smaller JSON files:
Backend:
const data = require('../data/data.json');

router.get('/', function(req, res, next) {
  res.json(data);
});
Frontend:
componentDidMount() {
  fetch('/')
    .then(res => res.json())
    .then(data => this.setState({ data: data }));
}
However, if data is bigger than ~40 MB, the backend crashes when I test locally because it runs out of memory. Holding onto the data with require() also takes quite a bit of memory on its own.
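From the research I've done so far, I understand that the server at least doesn't have to hold the whole file: fs.createReadStream can pipe the JSON from disk straight into the response. This untested sketch (same router and ../data/data.json path as above) is roughly what I have in mind:

const fs = require('fs');
const path = require('path');

router.get('/', function(req, res, next) {
  res.setHeader('Content-Type', 'application/json');
  // Stream the file instead of require()-ing it, so the whole
  // JSON never sits in the server's memory at once
  fs.createReadStream(path.join(__dirname, '../data/data.json'))
    .on('error', next)  // hand read errors to Express's error handler
    .pipe(res);         // res is writable, so chunks flow out as they are read
});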
More generally, I have a rough understanding of JSON parsing, stringifying, and streaming, and I think the answer lies in sending the data bit by bit as a chunked JSON stream, but I'm pretty much at a loss on the implementation, especially on the client side with a single fetch() (is this even possible?).
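For example, I gather that fetch() exposes the response body as a ReadableStream, so maybe the client can consume the chunks as they arrive, along the lines of this untested sketch (which still ends in one big JSON.parse of the whole string, which is exactly where I get stuck):

componentDidMount() {
  fetch('/')
    .then(res => {
      const reader = res.body.getReader();  // reads the body chunk by chunk
      const decoder = new TextDecoder();
      let text = '';
      const read = () =>
        reader.read().then(({ done, value }) => {
          if (done) {
            // Still one giant parse at the end, so the full string
            // is in memory here anyway
            this.setState({ data: JSON.parse(text) });
            return;
          }
          text += decoder.decode(value, { stream: true });
          return read();  // pull the next chunk
        });
      return read();
    });
}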
Definitely appreciate any suggestions on how to approach this.