I have a Beats > Logstash > Elasticsearch > Kibana pipeline that parses log files. Only some lines contain location data, but every line has the location fields. Attempting to convert the empty fields to a geo_point gives me a parse exception and that line is not indexed. To fix this, I am trying to convert the fields to geo_point only when they are populated, but I am having trouble determining whether a field is populated or not. I know my conditional currently compares the field to the string "null"; however, if I remove the quotes or replace the value with "", I get an unexpected character error (the failing variant is shown after the config below). Here is my config:
input {
  beats { port => 5044 }
  #stdin {}
}
filter {
  dissect {
    mapping => {
      "message" => "%{timestamp}|%{+timestamp}|%{level}|%{application}|%{module}|%{latitude}|%{longitude}|%{heading}|%{speed}|%{distance}|%{pulse}|%{thread}|%{cpu}|%{freq}|%{mem}|%{text}"
    }
    #convert_datatype => {
    #  speed => "float"
    #  freq => "float"
    #}
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'|'HH:mm:ss.SSS", "yyyy-MM-dd|HH:mm:ss.SSS", "ISO8601"]
  }
  mutate {
    remove_field => ["message"]
  }
  if [longitude] != "null" {
    # mutate applies rename before convert within a single block,
    # so convert and rename are kept in separate mutate filters
    mutate {
      convert => {
        "latitude"  => "float"
        "longitude" => "float"
      }
    }
    mutate {
      rename => {
        "longitude" => "[location][lon]"
        "latitude"  => "[location][lat]"
      }
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "log_index-%{+YYYY.MM.dd}"
  }
  stdout {}
}
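
For completeness, this is roughly the variant that throws the unexpected character error when I start Logstash, i.e. the same conditional with the quotes removed (a sketch of the failing attempt, not working config; the empty-string form "" fails the same way):

if [longitude] != null {
  # same mutate blocks as above
}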