I am trying to import a large JSON file that contains millions of records and is about 10 GB in size. What is the best way to do this?
The dataset is available here: https://opentender.eu/all/download
I initially tried splitting my JSON file into individual files containing one record each and using Filebeat to ship them to Elasticsearch, but I feel there has to be a better way to do it...
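For reference, this is roughly the splitting step I am doing now (a minimal sketch; the function and path names are placeholders, and it assumes the dump is a single top-level JSON array, which may not match the actual opentender.eu structure). Note that `json.load` pulls the whole file into memory, which is part of why this approach feels wrong for a 10 GB file:

```python
import json
import os

def split_records(src_path, out_dir):
    """Split a top-level JSON array into one file per record."""
    os.makedirs(out_dir, exist_ok=True)
    with open(src_path) as f:
        # Loads the ENTIRE file into memory -- problematic at 10 GB.
        records = json.load(f)
    for i, record in enumerate(records):
        with open(os.path.join(out_dir, f"record_{i}.json"), "w") as out:
            json.dump(record, out)
```

Each resulting file is then picked up by Filebeat.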
Thank you!