Best way of importing very large JSON files into Elasticsearch

Hi! I'm pretty new to the whole ELK stack.

So I have about 10,136 JSON files, each with at least 1,000 lines.
Here is an example of one of these JSON files:

What is the best way of inserting this into Elasticsearch?
One index per JSON file, a single index with many shards, or something completely different?

So far I've tried putting everything into one index, but searching gets really slow even with only 500 of these JSON files loaded. I figure there must be a better way.
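For reference, this is roughly how I'm building the import requests: batching documents into NDJSON bodies for the `_bulk` API (the index name and sample documents here are just placeholders, not my real data):

```python
import json

def bulk_payload(docs, index):
    """Build an NDJSON body for the Elasticsearch _bulk API.

    Each document is preceded by an action metadata line telling
    Elasticsearch which index to write it to. The _bulk endpoint
    requires a trailing newline after the last line.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# Placeholder documents standing in for one of my JSON files
docs = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
body = bulk_payload(docs, "my-index")
print(body)
```

The resulting body gets POSTed to `/_bulk` with `Content-Type: application/x-ndjson`. Is batching like this the right approach, or should I be using something like Logstash or the client library's bulk helpers instead?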

P.S. The data was originally hierarchical XML, but I got too many mapping errors that I couldn't resolve without going through each XML file one by one.
