How to push a .log file to Elasticsearch

Hi all,
I have a .log file with the following content. The values are separated by [tab].

|2018-10-04-00-15-17|10.1.1.100|8080|10.1.1.105|1|
|2018-10-04-00-15-20|10.1.1.10|80|10.2.1.15|1|
|2018-10-04-00-15-31|10.4.1.20|50560|10.1.1.105|1|
|2018-10-04-00-15-48|10.1.1.10|8080|10.4.1.21|1|
|2018-10-04-00-15-60|10.1.1.100|8080|10.1.1.105|1|


I have to give field names to these values and push them into Elasticsearch.
The fields should be the following:
time, src, port, dst, count
I don't know how to write the Logstash pipeline logic to push the data to Elasticsearch.

My expected output:
{
  "time": "2018-10-04-00-15-17",
  "src": "10.1.1.100",
  "port": "8080",
  "dst": "10.1.1.105",
  "count": 1
}

Thanks,
Sundar.

You need

  • a file input,
  • a csv filter (with the separator option set to \t), and
  • an elasticsearch output.

There are lots of blog posts that show complete examples.
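
For instance, a minimal pipeline along those lines might look like the sketch below. The file path, Elasticsearch host, and index name are placeholders you would replace with your own, and the separator must be a literal tab character (or \t if config.support_escapes is enabled).

input {
  file {
    path => "/path/to/your.log"       # placeholder: path to your log file
    start_position => "beginning"     # read the file from the start
    sincedb_path => "/dev/null"       # don't remember read position; handy while testing
  }
}

filter {
  csv {
    separator => "	"                  # a literal tab character between the quotes
    columns => ["time", "src", "port", "dst", "count"]
  }
  mutate {
    convert => { "count" => "integer" }  # store count as a number, as in the expected output
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]       # placeholder: your Elasticsearch host
    index => "connections"            # placeholder: your index name
  }
}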

OK, that's fine. How do I map the fields to the appropriate values?

There is a good grok example for matching here:

https://www.elastic.co/guide/en/logstash/6.2/config-examples.html
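
The sample lines above also look pipe-delimited, so as a sketch (not tested against your real file) a grok filter could name the fields directly instead of the csv filter:

filter {
  grok {
    # each %{PATTERN:name} captures one value between the pipes;
    # the trailing :int on count converts it to an integer
    match => { "message" => "^\|%{DATA:time}\|%{IP:src}\|%{INT:port}\|%{IP:dst}\|%{INT:count:int}\|$" }
  }
}

Either approach gives you the time/src/port/dst/count fields shown in your expected output.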

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.