Elasticsearch data format conversion


(Abhinav Dwivedi) #1

Team

I have 1 TB of Elasticsearch data across a 4-node cluster. Is there a way we can convert that data into CSV or JSON?

Regards
Abhinav D


(David Pilato) #2

You can probably use logstash for this.


(Abhinav Dwivedi) #3

Hi dadoonet,

I have 5 grok filters in Logstash. Could you please help me with how I can do that? Is there any documentation for it?

Regards
Abhinav D


(David Pilato) #4

I don't see the relationship between 5 grok filters and the initial question you asked.

You want to export data from Elasticsearch to a JSON file, right?
This is not related to any filter.


(Abhinav Dwivedi) #5

Yes, we want all the data (1.1 TB) in either a single JSON file, multiple JSON files, or CSV.

Is that possible?


(David Pilato) #6

Yes, probably with Logstash, using an elasticsearch input plugin and a file output plugin.
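A minimal sketch of such a pipeline, assuming Elasticsearch is reachable on localhost:9200; the index pattern and output path here are hypothetical placeholders:

```conf
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index-*"                            # hypothetical index pattern
    query => '{ "query": { "match_all": {} } }'      # export everything
  }
}

output {
  file {
    path  => "/tmp/es-export/%{+YYYY-MM-dd}.json"    # hypothetical path
    codec => json_lines                              # one JSON document per line
  }
}
```

Run it with `bin/logstash -f export.conf`. The `json_lines` codec writes newline-delimited JSON, which is easy to split, compress, or load into other tools afterwards.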


(Abhinav Dwivedi) #7

Hey

How can I know the data format (like JSON, CSV, etc.) of an ES snapshot?

Regards
Abhinav


(David Pilato) #8

If you mean the snapshot and restore feature, it's nothing like that. Let's say it's a binary format.


(Abhinav Dwivedi) #9

Hi Dadoonet

Actually, the problem is that we have all the data in ES right now, as I already mentioned. Around 1.1 TB of data.

Logstash doesn't hold much data itself; it just filters the data and pushes it to ES.

So I want to convert that 1.1 TB of data into JSON or CSV, so I can put the JSON or CSV in an S3 bucket and use the AWS tool Athena to query the data.

Then I can clear my 1.1 TB and use the ELK cluster for only one month of data.

Regards
Abhinav


(David Pilato) #10

Use logstash to read from elasticsearch and write to S3.
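A sketch of that pipeline using the s3 output plugin instead of a local file; the bucket name, region, and index pattern are hypothetical, and credentials are assumed to come from the usual AWS credential chain (environment variables, instance profile, etc.):

```conf
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index-*"                  # hypothetical index pattern
  }
}

output {
  s3 {
    bucket => "my-export-bucket"           # hypothetical bucket name
    region => "us-east-1"                  # hypothetical region
    codec  => json_lines                   # newline-delimited JSON suits Athena
  }
}
```

Athena can query newline-delimited JSON directly from S3, so no further conversion should be needed once the objects are uploaded.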


(Abhinav Dwivedi) #11

Hi,

Is there any documentation available for that?

Regards
Abhinav D


(system) #12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.