Hi,
Recently I started a project where I need to plot the different issues we have on a map.
I've been trying for a week to do it with Logstash, but I can't figure out why it isn't working...
Here is my ".conf":
input {
  file {
    path => "/opt/test05.csv"
    sincedb_path => "/dev/null"
    mode => "read"
    ignore_older => "29 d"
    file_completed_action => "delete"
  }
}

filter {
  grok {
    patterns_dir => ["/opt/paterns"]
    match => {
      "message" => "^%{DATA:errorid};%{USERNAME:errorcode};%{USERNAME:clientid};%{DATA:latlon:longitude};%{DATA:latlon:latitude};%{INT:connec_status};%{MINEDATE:erroruptime};%{IPV4:ip}$"
    }
  }

  geoip {
    source => "latlon"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }

  mutate {
    convert => {
      "[connec_status]" => "integer"
      "[latlon]" => "float"
      "[geoip][latitude]" => "float"
      "[geoip][longitude]" => "float"
      "[geoip][coordinates]" => "float"
    }
    copy => {
      "p05latlon" => "[geoip][coordinates]"
    }
  }

  date {
    match => [ "erroruptime", "dd/MM/YYYY HH:mm" ]
    locale => en
    remove_field => ["timestamp"]
  }

  if ("_grokparsefailure" in [tags]) {
    drop {}
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "test05"
  }
  stdout {}
}
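For reference, this is how I've been running it (assuming a standard package install under /usr/share/logstash and that the config is saved as /opt/test05.conf — adjust the paths to your setup):

```shell
# Check the config syntax without starting the pipeline
/usr/share/logstash/bin/logstash -f /opt/test05.conf --config.test_and_exit

# Run the pipeline in the foreground and watch the stdout {} output
/usr/share/logstash/bin/logstash -f /opt/test05.conf
```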
The input data:
829616001458;CD0678-TR1;34678792401;63.607171; 10.801033;1;24/10/2019 9:50;1XX.2X.1XX.1XX
829616001468;S68254-TR2;34686339212;63.584253; 10.736047;1;24/10/2019 15:20;1XX.2X.1XX.5XX
829616001486;S53348-TR1;34686334689;63.789019; 9.709261;1;24/10/2019 14:44;1XX.2X.1XX.5X
829616001498;S23192-TR1;34662624163;63.839019; 9.939261;1;24/10/2019 9:31;1XX.2X.1XX.2XX
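Just to show how I am reading those lines: the columns should pair up with the capture names in my grok pattern like this (a quick Python sketch outside Logstash, using a plain split instead of real grok matching — the field names just follow the order in my pattern):

```python
# Sanity check outside Logstash: split one input line on ';' and pair
# each column with the capture name from the grok pattern, in order.
# (Plain split, not real grok matching.)
FIELDS = ["errorid", "errorcode", "clientid", "longitude",
          "latitude", "connec_status", "erroruptime", "ip"]

line = ("829616001458;CD0678-TR1;34678792401;63.607171;"
        " 10.801033;1;24/10/2019 9:50;1XX.2X.1XX.1XX")

record = dict(zip(FIELDS, (col.strip() for col in line.split(";"))))
for name, value in record.items():
    print(f"{name} = {value}")
```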
And Kibana:
(I masked or edited some sensitive data that has nothing to do with the error.)
The thing is, it says the location is in Saudi Arabia, which is not even close... and for some reason there is no field that Kibana's visualizations recognize as a location.
I know this may look like a silly issue, but since I couldn't solve it with what I found on the net, I'm asking here.
Thank you.