Mapping reports to Elasticsearch

Hello,

I have a few reports that need to be ingested into Elasticsearch as whole files, each one individually, with a specific index covering all the reports, i.e. a dropdown for each individual report. How do I achieve this within ELK, and how would I write a script to create a dashboard based on the timeframe in which these reports are ingested? For example, from 2 days ago until now, or for a specific day or time?

Thanks,
Joshua.

Depends on what format the reports are in.

Yep, there's a CSV filter in Logstash that would help.
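
A rough pipeline sketch to give you an idea, assuming the reports are plain CSV files; the path and the column names are just placeholders for your own:

```
input {
  file {
    path           => "/path/to/reports/*.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    # Replace these placeholder column names with your report's actual headers
    columns   => ["report_name", "report_date", "value"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "reports"
  }
}
```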

@warkolm,

But that would ship the data as multiple events through Filebeat or Logstash's "file" input. In my use case, a tool generates some reports every day, so to avoid duplicating data I need to be able to search for a specific report and get it back as an entire file instead of as individual events, something like a dropdown to query my index for any particular CSV report and get the desired results. Is there any option for mapping my software's output to Elasticsearch?

Thanks,
Joshua.

So some reports might be updated day-to-day and they need to be overwritten? That's ok!

You can just take a few fields from the document and create your own Elasticsearch _id. Then Logstash can use that to index the report, so if an update happens it will use those same fields, generate the same _id, and Elasticsearch will treat it as an update :slight_smile:
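
For example, a minimal sketch of that idea, assuming report_name and report_date are the fields that uniquely identify one of your reports:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "reports"
    # Deterministic _id built from the identifying fields, so re-ingesting the
    # same report updates the existing document instead of creating a duplicate
    document_id => "%{report_name}-%{report_date}"
  }
}
```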

@warkolm,

This will definitely help me out. But how would I view a whole report or search for previous reports, i.e. get a list of reports that I want to look through for future reference while avoiding redundancy?
Or let me ask this: do we need to write a script to get the latest reports?

Thanks,
Joshua.

You just search by ID or by date or whatever.
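
For the timeframe part of your question, a range query like this one (run from Kibana Dev Tools) covers "from 2 days ago to now"; the index name reports and the @timestamp field are just assumptions carried over from the examples above:

```
GET reports/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "now-2d",
        "lte": "now"
      }
    }
  }
}
```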

I will try that. Thanks @warkolm :slight_smile:
