Hi, I have a log file structured like this:
2019-06-20T05:39:27,568 | INFO | qtp441473107-156663 | UID1
2019-06-20T05:39:28,874 | INFO | qtp441473107-156785 | UID1
2019-06-20T05:39:28,879 | INFO | qtp441473107-156786 | UID2
2019-06-20T05:39:28,880 | INFO | qtp441473107-156787 | UID2
I want to merge all the lines that share the same UID into a single event and display it in Kibana Discover. I am using Filebeat and Logstash 7.0.1. Thank you.
Logstash config file:
input {
  beats {
    port => 5044
  }
}
filter {
  dissect {
    mapping => {
      "message" => "%{date} | %{info} | %{ticks} | %{UID}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "filebeat-pcloulouze-%{+yyyy.MM.dd}"
  }
}
Use an aggregate filter — either example 3 or example 4 from the plugin documentation, depending on how the events arrive.
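A minimal sketch of the example-4 style (push_previous_map_as_event), which fits the log above since all lines for a given UID arrive contiguously. This assumes the aggregate filter plugin is installed and the pipeline runs with pipeline.workers set to 1 (the aggregate filter requires a single worker); the field names come from the dissect mapping:

```
filter {
  dissect {
    # split on " | " so the fields do not keep stray spaces
    mapping => { "message" => "%{date} | %{info} | %{ticks} | %{UID}" }
  }
  aggregate {
    # group events by their UID column
    task_id => "%{UID}"
    code => "
      map['date'] ||= event.get('date')
      map['info'] ||= event.get('info')
      map['ticks'] ||= []
      map['ticks'] << event.get('ticks')
      event.cancel()   # drop the raw line; only the merged map is emitted
    "
    # when a new UID shows up, flush the previous UID's map as one event
    push_previous_map_as_event => true
    timeout => 5
  }
}
```

The merged event then carries one date, one info level, and an array of ticks values per UID.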
Hi, I have a log file structured like this:
2019-06-20T05:39:27,568 | INFO | INPUT | UID1
2019-06-20T05:39:28,874 | INFO | INPUT | UID2
2019-06-20T05:39:28,879 | INFO | OUTPUT | UID1
2019-06-20T05:39:28,880 | INFO | OUTPUT | UID2
I want to merge all the lines that share the same UID into a single event and display it in Kibana Discover like this:
@timestamp Jun 25, 2019 @ 14:26:33.245
valUID: UID1
valIN: INPUT
valOUT: OUTPUT
valinfo: INFO
@timestamp Jun 25, 2019 @ 14:26:33.246
valUID: UID2
valIN: INPUT
valOUT: OUTPUT
valinfo: INFO
I am using Filebeat and Logstash 7.0.1. Thank you.
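Since UID1 and UID2 interleave in this second log, the example-3 style (push_map_as_event_on_timeout) fits better: each UID's map is flushed after a quiet period rather than when the next UID appears. A sketch, assuming the third column is dissected into a hypothetical "direction" field, pipeline.workers is 1, and a 10-second timeout is acceptable:

```
filter {
  dissect {
    # "direction" is a made-up name for the INPUT/OUTPUT column
    mapping => { "message" => "%{date} | %{info} | %{direction} | %{UID}" }
  }
  aggregate {
    task_id => "%{UID}"
    code => "
      map['valUID'] ||= event.get('UID')
      map['valinfo'] ||= event.get('info')
      map['valIN']  = event.get('direction') if event.get('direction') == 'INPUT'
      map['valOUT'] = event.get('direction') if event.get('direction') == 'OUTPUT'
      event.cancel()   # suppress the raw lines
    "
    # emit the accumulated map as a new event once the timeout elapses
    push_map_as_event_on_timeout => true
    timeout => 10
    timeout_tags => ['aggregated']
  }
}
```

The flushed event should then show valUID, valIN, valOUT, and valinfo together in Discover, tagged "aggregated"; its @timestamp is set when the map is pushed, not from the original log lines.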