Getting started with Elasticsearch!

I have a bunch of .json files that contain ~a million judgements. I want to build a search engine on a website that queries these files. I am looking for tutorials or something similar that goes into this in detail.



Do you have one json document per file?

Hi @dadoonet
All the judgements are in a single .json file (it's ~10 GB). Here is the schema:

{ "id" : [ title, body... ] }

So you need to write a script which parses every single line and generates an index request from it, which you can add to a bulk request.
Every x documents, call the bulk API with the bulk request you created and start again.
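A minimal sketch of that script in Python, assuming the big file holds one JSON object per line (NDJSON) with fields like `id`, `title`, and `body` — the filename, index name, and field names are placeholders for your data:

```python
# Sketch of a bulk loader. Assumes one JSON document per line.
import json


def bulk_actions(lines, index="judgements"):
    """Turn raw JSON lines into actions for the Elasticsearch bulk helper."""
    for line in lines:
        line = line.strip()
        if not line:
            continue
        doc = json.loads(line)
        yield {
            "_index": index,
            "_id": doc.get("id"),  # reuse your own id so re-runs overwrite
            "_source": doc,
        }


if __name__ == "__main__":
    # Requires: pip install elasticsearch
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch("http://localhost:9200")
    with open("judgements.json", encoding="utf-8") as f:
        # helpers.bulk batches the actions into bulk requests for you
        # (chunk_size documents per request), which is the "every x
        # documents" loop described above.
        helpers.bulk(es, bulk_actions(f), chunk_size=500)
```

`helpers.bulk` from the official Python client handles the batching and retry bookkeeping, so the script stays a simple generator over the file.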

Otherwise, you can generate one file per line of your big JSON file, then use the FSCrawler project to import all the JSON files. There's an option for that.

Or look at Logstash with its file input plugin and its JSON codec or filter, but I'm unsure if that would work.
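An untested sketch of that Logstash idea — the file is read line by line and the json codec parses each line; the paths, host, and index name are placeholders:

```
input {
  file {
    path => "/data/judgements.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "judgements"
  }
}
```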

Is there an Elastic Cloud solution for the same? In short, I want to upload my data and get an endpoint that I can call from my website. I am not familiar with the ELK stack, but I would like something that is already hosted and ready to go, since we are on a tight deadline.



Cloud by Elastic is one way to have access to all features, all managed by us. Think about everything that is already there, like Security, Monitoring, Reporting, SQL, Canvas, APM, Logs UI, Infra UI, SIEM, Maps UI, and what is coming next :slight_smile: ...

But you will still need to write an application which exposes a REST endpoint your website can call, and which sends the bulk and search requests to the cloud instance.
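As an illustration only, such an application's search side might look like this Flask sketch — Flask is an assumption (any web framework works), and the Cloud ID, API key, index, and field names are placeholders:

```python
# Build the full-text query separately so it is easy to test and extend.
def build_query(text, field="body"):
    """Match query against one field; swap in multi_match for several fields."""
    return {"match": {field: text}}


if __name__ == "__main__":
    # Requires: pip install flask elasticsearch
    from flask import Flask, jsonify, request
    from elasticsearch import Elasticsearch

    app = Flask(__name__)
    # Placeholder credentials for a hosted Elastic Cloud deployment.
    es = Elasticsearch(cloud_id="YOUR_CLOUD_ID", api_key="YOUR_API_KEY")

    @app.route("/search")
    def search():
        q = request.args.get("q", "")
        resp = es.search(index="judgements", query=build_query(q))
        return jsonify([hit["_source"] for hit in resp["hits"]["hits"]])

    app.run()
```

Keeping a small backend like this between the website and the cluster also means your Elasticsearch credentials never reach the browser.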


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.