Invalid Config File

When I run --configtest I get the following error:

     Reason: Expected one of #, at line 45, column 4 after output {
      
      if [type] == "test2"
      {
      elasticsearch {
                     hosts => "localhost:9200"
                     action => "index"
                     index => "test2"
                     workers => 29
       }
       stdout {
       codec => rubydebug 
      }
     }
     ES Mapping:
  
    POST /test2
    {
    "settings" : {
               "number_of_shards" : 1
    },
    "mappings" : {
    "test2": {
                  
                 "properties" : {
                 
                  "location" : { "type": '"geo_point"},
                  "city":{"type":"string"},
                  "country":{"type":"string"}
                 }
               }
           }
       }

}

Please post your configuration.

input {
        file {
               path => "path to file"
               type => "test2"
               start_position => "beginning"
              ignore_older => 0
              sincedb_path => "/dev/null"
       }
}

filter {
         if [type] == "test2"
         {
          csv {
                columns => [
                  "timestamp"
                   "latitude"
                    "longitude"
                   "estaddress"
                 ]
               separator => ","
             }
          }
}
output {
 if [type] == "test2"
      {
      elasticsearch {
                     hosts => "localhost:9200"
                     action => "index"
                     index => "test2"
                     workers => 29
       }
       stdout {
       codec => rubydebug 
      }
     }
     ES Mapping:

    POST /test2
    {
    "settings" : {
               "number_of_shards" : 1
    },
    "mappings" : {
    "test2": {

                 "properties" : {

                  "location" : { "type": '"geo_point"},
                  "city":{"type":"string"},
                  "country":{"type":"string"}
                 }
               }
           }
       }
}

It looks like your csv columns array needs commas after each element; also, you're missing a closing brace for the output section. When I add those, it tests OK.

input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}

filter {
if [type] == "test2"
{
csv {
columns => [
"timestamp",
"latitude",
"longitude",
"estaddress"
]
separator => ","
}
}
}
output {
if [type] == "test2"
{
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
}

I edited the config file, but I'm still having the same issue. Is the ES mapping section correct?

On kibana when I try to use tile map:

The "test2" index pattern does not contain any of the following field types: geo_point

Is the ES mapping section correct?

There's an extra single quote that doesn't belong there. Otherwise it looks okay, but your current Logstash configuration doesn't insert any geo data in the location field. So, make sure the mapping is correctly set and modify your Logstash filters to correctly insert the lat/lon data into the geo_point field.

OK, the typos are fixed, but how would I modify Logstash to insert the lat/lon data?

See https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html for a list of how a geo_point value can be represented for Elasticsearch to recognize it as geo_point. Use Logstash filters like mutate to make sure your location field (the intended geo_point field) will be understood by ES. For example,

mutate {
  add_field => {
    "location" => "%{latitude},%{longitude}"
  }
}

should work.

Here are the changes I made after the csv { ... } block:

if [latitude] and [longitude] {
       mutate {
                      add_field => [ "[location", "%{longitude}" ]
                      add_field => [ "[location", "%{latitude}" ]
       }
       mutate {
                      convert => [ "[location]", "geo_point" ]
                      }
                      }
}

output ......

I am still getting expected one of # error after if [type] == "test2"

add_field => [ "[location", "%{longitude}" ]

Change to one of the following:

add_field => [ "location", "%{longitude}" ]
add_field => { "location" => "%{longitude}" }
add_field => [ "[location]", "%{longitude}" ]
add_field => { "[location]" => "%{longitude}" }
convert => [ "[location]", "geo_point" ]

As documented, the mutate filter's convert option doesn't support the geo_point type.
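For example, a filter section along these lines should do it (a sketch assuming your csv filter produces `latitude` and `longitude` fields; no convert is needed, since Elasticsearch parses the "lat,lon" string form once the field is mapped as geo_point):

```
filter {
  csv {
    columns => ["timestamp", "latitude", "longitude", "estaddress"]
    separator => ","
  }
  if [latitude] and [longitude] {
    mutate {
      # "lat,lon" is one of the string representations ES accepts for geo_point
      add_field => { "location" => "%{latitude},%{longitude}" }
    }
  }
}
```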

If I change it to

convert => [ "[location]", "float" ]

I am still getting expected one of #, => at line 42, column 5 after filter {
if [type] == "test2"

Without seeing your full configuration file I can't help.

input {
        file {
               path => "path to file"
               type => "test2"
               start_position => "beginning"
               ignore_older => 0
               sincedb_path => "/dev/null"
       }
}

filter {
         if [type] == "test2"
         {
          csv {
                  columns => [
                  "timestamp"
                   "latitude"
                    "longitude"
                   "estaddress"
                 ]
               separator => ","
             }
}

if [latitude] and [longitude] {
   mutate {
                  add_field => [ "[location", "%{longitude}" ]
                  add_field => [ "[location", "%{latitude}" ]
   }
   mutate {
                  convert => [ "[location]", "float" ]
                  }
                  }
}
          }
}
output {
 if [type] == "test2"
      {
      elasticsearch {
                     hosts => "localhost:9200"
                     action => "index"
                     index => "test2"
                     workers => 29
       }
       stdout {
       codec => rubydebug 
      }
 
     }
     ES Mapping:

    POST /test2
    {
    "settings" : {
               "number_of_shards" : 1
    },
    "mappings" : {
    "test2": {

                 "properties" : {

                  "location" : { "type": '"geo_point"},
                  "city":{"type":"string"},
                  "country":{"type":"string"}
                 }
               }
           }
       }
}
}

Wait, what? The "ES Mapping:" part is included in your Logstash configuration file? Remove it!

input {
        file {
               path => "path to file"
               type => "test2"
               start_position => "beginning"
               ignore_older => 0
               sincedb_path => "/dev/null"
       }
}

filter {
          csv {
                  columns => [
                  "timestamp"
                   "latitude"
                    "longitude"
                   "estaddress"
                 ]
               separator => ","
             }
}

if [latitude] and [longitude] {
   mutate {
                  add_field => [ "[location", "%{longitude}" ]
                  add_field => [ "[location", "%{latitude}" ]
   }
   mutate {
                  convert => [ "[location]", "float" ]
                  }
                  }
}
          }
}
output {
      elasticsearch {
                     hosts => "localhost:9200"
                     action => "index"
                     index => "test2"
                     workers => 29
       }
       stdout {
       codec => rubydebug 
      }
}

This configuration file works, but Kibana is not mapping the geo coordinates. It states the "test2" index pattern does not contain a geo_point field. Where do I declare geo_point in the config file?

You need to create an index template that will get applied to the indexes you'll be creating. In that template you can configure the location field as geo_point. The elasticsearch output has options related to index templates.
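For example, a minimal template along these lines (a sketch; adjust the template pattern and the field list to match your data):

```
{
  "template": "test2",
  "mappings": {
    "_default_": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

Then point the elasticsearch output at it with its template, manage_template, and template_overwrite options.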

OK, I got it to run with the new index.json template, but I'm still getting the no geo_point error.

my config file:

input {
        file {
               path => "path to file"
               type => "test2"
               start_position => "beginning"
               ignore_older => 0
               sincedb_path => "/dev/null"
       }
}

filter {
          csv {
                  columns => [
                  "timestamp"
                   "latitude"
                    "longitude"
                   "estaddress"
                 ]
               separator => ","
             }
}

if [latitude] and [longitude] {
   mutate {
                  add_field => [ "[location", "%{longitude}" ]
                  add_field => [ "[location", "%{latitude}" ]
   }
   mutate {
                  convert => [ "[location]", "float" ]
                  }
                  }
}
          }
}
output {
      elasticsearch {
                     hosts => "localhost:9200"
                     action => "index"
                     index => "test2"
                     workers => 29
                     manage_template => true
                     template => "path to template.json"
                     template_overwrite => "true"
       }
       stdout {
       codec => rubydebug 
      }
}

My template.json

{
     "template" : "test2",
     "settings" : {
                   "index.refresh_interval" : "5s"
       },
       "mappings" : {
                            "_default_" : {
                           "_all_" : { "enabled" : true, "omit_norms" : true },
                           "properties" : {
                          "@timestamp" : { "type" :  "date", "doc_values" : true},
                          "location" : {
                                      "type" :  "geo_point",
                                      "dynamic" : true,
                                       "properties" : {
                                       "latitude" : { "type" : "float", "doc_values" : true },
                                         "longitude" : { "type" : "float", "doc_values" : true }
                                 }
                           }
                       }
                   }
           }
}

Did you delete and recreate the test2 index after getting the index template in place?

I don't think so. I re-ran it on cmd but is there a command line function to delete the previous index?

Yes, of course: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-delete-index.html
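For example (assuming Elasticsearch is running on localhost:9200 and the index is named test2):

```
curl -XDELETE 'http://localhost:9200/test2'
```

Then re-run Logstash so the index is recreated with the template applied.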