A way to parse huge JSON files when memory used to be a limitation

Andrei Negruți - Senior Java Developer @ RWS

Studio Room

9th November, 10:30-11:00

Everyone uses JSON files.
Thankfully, most of the time the JSON files we use are small, and we can simply read and process everything in memory because it is convenient and easy to do.
But most of the time is not all the time. Sometimes you must process big JSON files, and the moment you try to do this the old-fashioned way you will soon see the dreadful “java.lang.OutOfMemoryError”.

One search on the internet and you will find solutions to this problem.
Concisely, you will see variations of these answers:

  • Split your file into smaller ones
  • Increase max memory used (yes, this is one of the answers)
  • Save the JSON in a temporary file and use the streaming capabilities of GSON or Jackson

GSON and Jackson work well, but they require you to write a lot of boilerplate code and get your hands dirty with tokens, if checks, path checks, and so on.
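To make the boilerplate concrete, here is a minimal sketch of what a hand-rolled Jackson streaming loop looks like. It assumes jackson-core is on the classpath; the JSON shape and the `text` field name are invented for illustration, not taken from the talk.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class StreamingBoilerplate {
    public static void main(String[] args) throws IOException {
        // In a real service this reader would wrap a huge file stream;
        // a small inline document stands in for it here.
        String json = "{\"paragraphs\":[{\"id\":1,\"text\":\"Hello\"},"
                + "{\"id\":2,\"text\":\"World\"}]}";
        List<String> texts = new ArrayList<>();

        try (JsonParser parser = new JsonFactory().createParser(new StringReader(json))) {
            // The boilerplate: walk every token yourself and check where you are.
            while (parser.nextToken() != null) {
                if (parser.currentToken() == JsonToken.FIELD_NAME
                        && "text".equals(parser.getCurrentName())) {
                    parser.nextToken(); // advance from the field name to its value
                    texts.add(parser.getText());
                }
            }
        }
        System.out.println(texts); // prints [Hello, World]
    }
}
```

Only one token is held at a time, so memory stays flat regardless of file size, but every caller has to repeat this token-and-check dance.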
We developed a fourth option: we abstracted away what Jackson can do and created an interface that is easy to understand and interact with.
With its help we delivered increased performance, reduced the memory our service needs by more than 50%, and can now translate an unlimited number of paragraphs, because we no longer hold the entire file in memory.
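The abstract does not show the actual interface, but a purely hypothetical sketch of such an abstraction might look like the following: the token loop is hidden behind one method, and the caller only names the field it cares about. The `forEachField` name, the JSON shape, and the `text` field are all invented for illustration.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class FieldStreamer {
    // Stream every string value of the given field to the consumer,
    // never materializing the whole document in memory.
    public static void forEachField(Reader source, String field,
                                    Consumer<String> consumer) throws IOException {
        try (JsonParser parser = new JsonFactory().createParser(source)) {
            while (parser.nextToken() != null) {
                if (parser.currentToken() == JsonToken.FIELD_NAME
                        && field.equals(parser.getCurrentName())) {
                    parser.nextToken(); // move to the value token
                    consumer.accept(parser.getText());
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        String json = "{\"paragraphs\":[{\"text\":\"Hello\"},{\"text\":\"World\"}]}";
        List<String> out = new ArrayList<>();
        // The caller no longer sees tokens or if checks, only the values it asked for.
        forEachField(new StringReader(json), "text", out::add);
        System.out.println(out); // prints [Hello, World]
    }
}
```

Wrapping the streaming parser this way keeps the flat memory profile while giving callers a declarative entry point, which is the kind of trade the talk describes.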

Andrei Negruți


I'm a Software Developer in the Trados Enterprise team within RWS Group, where I'm responsible for building the best cloud translation management solution. Besides the work I do at my current company, I like sharpening my skills at home by picking up new books, learning about new technologies, or giving a Devoxx presentation. The single most stimulating piece of advice that shaped me professionally is to leave the code cleaner than you found it, advice that now encapsulates my work ethic. When I'm away from my laptop I enjoy reading thriller books, playing board games, or playing basketball.