Location to geo_point mapping not working with Logstash and Kibana

Hello,

I am trying to plot a group of store locations on a map; I have the ZIP code, latitude, and longitude of each store.

The version being used is 6.4.2.

I am converting the location field to geo_point using the Logstash Elasticsearch output and specifying a custom template.

My Logstash config:

input {
  file {
    path => "/opt/ELK/log_lat.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["lati"]
  }
}
filter {
  if "lati" in [tags] {
    csv {
      separator => ","
      columns => ["ZIP","LAT","LNG"]
    }

    if [LAT] and [LNG] {
      mutate {
        add_field => {
          "[location][lat]" => "%{LAT}"
          "[location][lon]" => "%{LNG}"
        }
      }
      mutate {
        convert => {
          "[location][lat]" => "float"
          "[location][lon]" => "float"
        }
      }
    }
  }
}

output {
  if "lati" in [tags] {
    elasticsearch {
      manage_template => "false"
      template => "/tmp/template_mapping.json"
      hosts => "http://localhost:9200"
      index => "xstorelat-index"
    }
  }
  stdout {}
}

My template file:

{
  "index_patterns": ["xstorelat*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "ZIP": {
          "type": "text"
        },
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}

There is no error, but when the data is in Elasticsearch after the run, the location field does not change to geo_point; it is still mapped as below.

     "location": {
        "properties": {
          "lat": {
            "type": "float"
          },
          "lon": {
            "type": "float"
          }
        }
      }
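(The mapping shown above can be retrieved in the Kibana Dev Tools console with a request like:)

GET xstorelat-index/_mapping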

Please help


Looks like you are performing a float conversion on the values, and they stay that way once Elasticsearch ingests them. Going off what I know works (there are many ways to do it), the config below should work for you; it stores the lat/lon values as a single field in string format.

Pipeline Config

input {
  file {
    path => "/opt/ELK/log_lat.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["lati"]
  }
}
filter {
  if "lati" in [tags]{
    csv {
      separator => ","
      columns => ["ZIP","LAT","LNG"]
    }
    if [LAT] and [LNG] {
      mutate {
        add_field => {
        "[location][coordinates]" => "%{LAT}, %{LNG}"
        }
      }
    }
  }
}

Index Template

{
  "index_patterns": ["xstorelat*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "ZIP": {
          "type": "text"
        },
        "location": {
          "properties": {
            "coordinates": {
              "type": "geo_point"
            }
          }
        }
      }
    }
  }
}
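(As a quick sanity check of this approach, you can index a single document by hand in Dev Tools; the index name xstorelat-test below is hypothetical, chosen only so it matches the template's xstorelat* pattern. If the template applies, the string value is accepted as a geo_point:)

PUT xstorelat-test/_doc/1
{
  "location": {
    "coordinates": "33.74, -84.39"
  }
}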

Thank you for the response.
It's still not updating the type.

Logstash log:

Using mapping template from {:path=>"/tmp/template_mapping.json"}
[2018-12-16T13:13:09,805][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>["xstorelat*"], "settings"=>{"number_of_shards"=>1}, "mappings"=>{"_doc"=>{"_source"=>{"enabled"=>false}, "properties"=>{"ZIP"=>{"type"=>"text"}, "loc"=>{"properties"=>{"coordinates"=>{"type"=>"geo_point"}}}}}}}}
[2018-12-16T13:13:10,079][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3b0e0d6d run>"}
[2018-12-16T13:13:10,137][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-16T13:13:10,154][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-16T13:13:10,534][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Template file:

{
  "index_patterns": ["xstorelat*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "ZIP": {
          "type": "text"
        },
        "loc": {
          "properties": {
            "coordinates": {
              "type": "geo_point"
            }
          }
        }
      }
    }
  }
}

Logstash config:

input {
  file {
    path => "/opt/ELK/log_lat.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["lati"]
  }
}
filter {
  if "lati" in [tags] {
    csv {
      separator => ","
      columns => ["ZIP","LAT","LNG"]
    }

    if [LAT] and [LNG] {
      mutate {
        add_field => {
          "[loc][coordinates]" => "%{LAT}, %{LNG}"
        }
      }
    }
  }
}

output {
  if "lati" in [tags] {
    elasticsearch {
      index => "xstorelat-index"
      template => "/tmp/template_mapping.json"
      hosts => "http://localhost:9200"
    }
  }
  stdout {}
}


The type after the mapping is now:

 "loc": {
            "properties": {
              "coordinates": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }

This is pretty straightforward, and there are many posts about the same problem that you can read to make this work. For example, please read: https://www.elastic.co/blog/geoip-in-the-elastic-stack

I have followed most of the posts; most of them are for GeoIP.

I am not using GeoIP; I am feeding in the lat and long fields directly from a CSV file.

Have you tried manually updating the index in the dev console?
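(Worth noting, as context for this suggestion: index templates only take effect when an index is created, so if xstorelat-index already existed before the template was installed, the old text mapping sticks. One option, assuming the data can be re-ingested, is to delete the index in Dev Tools and let Logstash recreate it:)

DELETE xstorelat-index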

Interesting. You want to use geo_point, but you don't want to use the geo_point data type. I applaud your efforts at trying something new and venturing outside of what works. Good luck.

I don't think you understand the problem here; I am trying to convert a location field of floats, built in the Logstash filter, to geo_point using a custom template.

Your post is about the GeoIP filter, which is not what my input is.

I am not interested in your sarcasm; if you do not have a solution, please don't comment.

There is no sarcasm. I don't think you understand how to load geo_point data into Elasticsearch so that it can be consumed. If you read the articles, they will show you how. But many people do not want to read the documentation; they want an answer for the custom data they have developed, and to get it to work.

Yes, thanks for the input; I am browsing through the articles.

Like I said, if you don't have a solution, please don't comment. If I knew how this worked, I would have no reason to add a topic here.

I am glad you are reading and understanding. I am happy that you are reading the instruction manual versus trying to build something and then realizing it doesn't work.

Hello, updated info here on what I am doing now. It is still not working, but there is some progress:

  1. I have a template in Elasticsearch, "pravin_template".
  2. I have made the Logstash Elasticsearch output explicitly use the above template.
  3. The index is now being created successfully by Logstash with the geo_point field.
  4. There is no data populated to the index; not sure why, as there is no error in the Logstash logs (see the check below).
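(One possible explanation for step 4, offered as an aside: with "_source": { "enabled": false } in the template, documents can be indexed successfully yet look empty in search results, because the original JSON is never stored. A count request in Dev Tools shows whether documents actually made it into the index:)

GET xstorelat-index/_count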


Logstash config:

input {
  file {
    path => "/opt/ELK/log_lat.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["lati"]
  }
}
filter {
  if "lati" in [tags] {
    csv {
      separator => ","
      columns => ["ZIP","LAT","LNG"]
    }

    if [LAT] and [LNG] {
      mutate {
        add_field => { "[location-pra][lat]" => "%{[LAT]}" }
        add_field => { "[location-pra][lon]" => "%{[LNG]}" }
      }
      mutate {
        convert => { "[location-pra][lat]" => "float" }
        convert => { "[location-pra][lon]" => "float" }
      }
    }
  }
}

output {
  if "lati" in [tags] {
    elasticsearch {
      index => "xstorelat-index"
      manage_template => "false"
      document_type => "_doc"
      template_name => "pravin_template"
      hosts => "http://localhost:9200"
    }
  }
  stdout {}
}


Template in Elasticsearch, pushed using Dev Tools:

PUT _template/pravin_template
{
  "index_patterns": [
    "xstorelat-index"
  ],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "@version": {
          "type": "text"
        },
        "ZIP": {
          "type": "text"
        },
        "LAT": {
          "type": "text"
        },
        "LNG": {
          "type": "text"
        },
        "location-pra": {
          "type": "geo_point"
        },
        "host": {
          "type": "text"
        },
        "message": {
          "type": "text"
        },
        "path": {
          "type": "text"
        },
        "tags": {
          "type": "text"
        }
      }
    }
  }
}


Index created successfully:

{
  "xstorelat-index": {
    "mappings": {
      "pra_doc": {
        "_source": {
          "enabled": false
        },
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "text"
          },
          "LAT": {
            "type": "text"
          },
          "LNG": {
            "type": "text"
          },
          "ZIP": {
            "type": "text"
          },
          "host": {
            "type": "text"
          },
          "location-pra": {
            "type": "geo_point"
          },
          "message": {
            "type": "text"
          },
          "path": {
            "type": "text"
          },
          "tags": {
            "type": "text"
          }
        }
      }
    }
  }
}

There is no data in the index except for the time field.

Finally got it to work!!!

Step 1: create your own template and put it in Elasticsearch.

PUT _template/pravin_template
{
  "index_patterns": ["xstorelat-*"],
  "settings": {
    "index": {
      "refresh_interval": "5s"
    }
  },
  "mappings": {
    "_doc": {
      "dynamic_templates": [
        {
          "message_field": {
            "path_match": "message",
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "norms": false
            }
          }
        },
        {
          "string_fields": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "norms": false,
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      ],
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "@version": {
          "type": "keyword"
        },
        "location-pra": {
          "type": "geo_point"
        },
        "tags": {
          "type": "keyword"
        }
      }
    }
  },
  "aliases": {}
}
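(Optionally, the installed template can be verified from Dev Tools with:)

GET _template/pravin_template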

Step 2: make sure you force Logstash to use your template.

input {
  file {
    path => "/opt/ELK/log_lat.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["lati"]
  }
}
filter {
  if "lati" in [tags] {
    csv {
      separator => ","
      columns => ["ZIP","LAT","LNG"]
    }

    if [LAT] and [LNG] {
      mutate {
        add_field => { "[location-pra][lat]" => "%{[LAT]}" }
        add_field => { "[location-pra][lon]" => "%{[LNG]}" }
      }
      mutate {
        convert => { "[location-pra][lat]" => "float" }
        convert => { "[location-pra][lon]" => "float" }
        convert => { "[ZIP]" => "string" }
      }
    }
  }
}

output {
  if "lati" in [tags] {
    elasticsearch {
      index => "xstorelat-%{+YYYY.MM.dd}"
      manage_template => "false"
      document_type => "_doc"
      # template_overwrite => "true"
      template_name => "pravin_template"
      hosts => "http://localhost:9200"
    }
  }
  stdout {}
}

Works like a charm; it displays all the data in Kibana.

There is a warning about the document_type option being deprecated in Logstash 7.0; I may have to look at a way to bypass it, but as of now the data is in.


Very cool, and nice work using geo_point in your template.

Since you used the template API to define the template in your cluster, you can remove the template statement in Logstash. The template already knows which indices to apply itself to, because of the index pattern reference in the template.

Thank you.
Yes, I will remove that.
I am also looking at an option to remove the document_type setting in the output, as it is to be deprecated.
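(A minimal sketch of what that trimmed-down output block could look like, assuming the template stays installed in the cluster via the template API; note that on this 6.4 stack the default document type when the setting is dropped may not be _doc, so treat this as an assumption to verify rather than a drop-in change:)

elasticsearch {
  index => "xstorelat-%{+YYYY.MM.dd}"
  manage_template => "false"
  hosts => "http://localhost:9200"
}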
