kmk3173
(Kate)
August 24, 2016, 7:50pm
1
When I run --configtest I get the following error:
Reason: Expected one of #, at line 45, column 4 after output {
if [type] == "test2"
{
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
ES Mapping:
POST /test2
{
"settings" : {
"number_of_shards" : 1
},
"mappings" : {
"test2": {
"properties" : {
"location" : { "type": '"geo_point"},
"city":{"type":"string"},
"country":{"type":"string"}
}
}
}
}
}
Please post your configuration.
kmk3173
(Kate)
August 24, 2016, 8:28pm
3
input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}
filter {
if [type] == "test2"
{
csv {
columns => [
"timestamp"
"latitude"
"longitude"
"estaddress"
]
separator => ","
}
}
}
output {
if [type] == "test2"
{
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
ES Mapping:
POST /test2
{
"settings" : {
"number_of_shards" : 1
},
"mappings" : {
"test2": {
"properties" : {
"location" : { "type": '"geo_point"},
"city":{"type":"string"},
"country":{"type":"string"}
}
}
}
}
}
GardenMWM
(Matthew Marshall)
August 24, 2016, 8:38pm
4
It looks like your csv columns array needs to have commas after each element, and you're also missing a closing brace for the output. When I add those, it tests OK.
input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}
filter {
if [type] == "test2"
{
csv {
columns => [
"timestamp",
"latitude",
"longitude",
"estaddress"
]
separator => ","
}
}
}
output {
if [type] == "test2"
{
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
}
kmk3173
(Kate)
August 24, 2016, 8:44pm
5
I edited the config file, but I'm still having the same issue. Is the ES mapping section correct?
On kibana when I try to use tile map:
The "test2" index pattern does not contain any of the following filed types: geo_point
Is the ES mapping section correct?
There's an extra single quote that doesn't belong there. Otherwise it looks okay, but your current Logstash configuration doesn't insert any geo data in the location
field. So, make sure the mapping is correctly set and modify your Logstash filters to correctly insert the lat/lon data into the geo_point field.
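For reference, with the stray quote removed the mapping would look something like this (same index and type as in your post):
POST /test2
{
  "settings" : {
    "number_of_shards" : 1
  },
  "mappings" : {
    "test2" : {
      "properties" : {
        "location" : { "type" : "geo_point" },
        "city" : { "type" : "string" },
        "country" : { "type" : "string" }
      }
    }
  }
}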
kmk3173
(Kate)
August 25, 2016, 2:10pm
7
OK, the typos are fixed, but how would I modify Logstash to insert the lat/long data?
See https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html for a list of how a geo_point value can be represented for Elasticsearch to recognize it as geo_point. Use Logstash filters like mutate to make sure your location
field (the intended geo_point field) will be understood by ES. For example,
mutate {
add_field => {
"location" => "%{latitude},%{longitude}"
}
}
should work.
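Put together with the csv filter you already have, the filter section could look roughly like this (a sketch, assuming latitude and longitude come through as plain decimal degrees):
filter {
  csv {
    columns => [ "timestamp", "latitude", "longitude", "estaddress" ]
    separator => ","
  }
  if [latitude] and [longitude] {
    mutate {
      # build a "lat,lon" string that ES can index into a geo_point field
      add_field => { "location" => "%{latitude},%{longitude}" }
    }
  }
}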
kmk3173
(Kate)
August 25, 2016, 2:25pm
9
Here are the changes I made after csv {
}
if [latitude] and [longitude] {
mutate {
add_field => [ "[location", "%{longitude}" ]
add_field => [ "[location", "%{latitude}" ]
}
mutate {
convert => [ "[location]", "geo_point" ]
}
}
}
output ......
I am still getting expected one of # error after if [type] == "test2"
add_field => [ "[location", "%{longitude}" }
Change to one of the following:
add_field => [ "location", "%{longitude}" ]
add_field => { "location" => "%{longitude}" }
add_field => [ "[location]", "%{longitude}" ]
add_field => { "[location]" => "%{longitude}" }
convert => [ "[location]", "geo_point" ]
As documented, the mutate filter's convert option doesn't support the geo_point type.
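Applied to your snippet, a working variant would look something like this (a sketch, following the same shape as the geoip examples): close the field reference properly and convert to float instead of geo_point:
if [latitude] and [longitude] {
  mutate {
    # longitude first, then latitude: ES expects [lon, lat] for array-style geo_points
    add_field => [ "[location]", "%{longitude}" ]
    add_field => [ "[location]", "%{latitude}" ]
  }
  mutate {
    convert => [ "[location]", "float" ]
  }
}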
kmk3173
(Kate)
August 25, 2016, 2:41pm
11
If I change it to
convert => [ "[location]", "float" ]
I am still getting expected one of #, => at line 42, column 5 after filter {
if [type] == "test2"
Without seeing your full configuration file I can't help.
kmk3173
(Kate)
August 25, 2016, 5:15pm
13
input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}
filter {
if [type] == "test2"
{
csv {
columns => [
"timestamp"
"latitude"
"longitude"
"estaddress"
]
separator => ","
}
}
if [latitude] and [longitude] {
mutate {
add_field => [ "[location", "%{longitude}" ]
add_field => [ "[location", "%{latitude}" ]
}
mutate {
convert => [ "[location]", "float" ]
}
}
}
}
}
output {
if [type] == "test2"
{
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
ES Mapping:
POST /test2
{
"settings" : {
"number_of_shards" : 1
},
"mappings" : {
"test2": {
"properties" : {
"location" : { "type": '"geo_point"},
"city":{"type":"string"},
"country":{"type":"string"}
}
}
}
}
}
}
Wait, what? The "ES Mapping:" part is included in your Logstash configuration file? Remove it!
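The mapping needs to be sent to Elasticsearch separately, e.g. from the Kibana console or with curl, not placed inside the Logstash pipeline file. A sketch, assuming Elasticsearch is listening on localhost:9200 (add the city/country fields from your mapping as needed):
curl -XPUT 'http://localhost:9200/test2' -d '
{
  "settings" : { "number_of_shards" : 1 },
  "mappings" : {
    "test2" : {
      "properties" : {
        "location" : { "type" : "geo_point" }
      }
    }
  }
}'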
kmk3173
(Kate)
August 25, 2016, 6:30pm
15
input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}
filter {
csv {
columns => [
"timestamp"
"latitude"
"longitude"
"estaddress"
]
separator => ","
}
}
if [latitude] and [longitude] {
mutate {
add_field => [ "[location", "%{longitude}" ]
add_field => [ "[location", "%{latitude}" ]
}
mutate {
convert => [ "[location]", "float" ]
}
}
}
}
}
output {
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
}
stdout {
codec => rubydebug
}
}
This configuration file works, but Kibana is not mapping the geo coordinates. It states the "test2" index pattern does not contain a geo_point field. Where do I declare geo_point in the config file?
You need to create an index template that will get applied to the indexes you'll be creating. In that template you can configure the location
field as geo_point. The elasticsearch output has options related to index templates.
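In the elasticsearch output that would look roughly like this (a sketch; the template path and name are placeholders):
elasticsearch {
  hosts => "localhost:9200"
  index => "test2"
  manage_template => true
  template => "/path/to/template.json"   # the file that declares location as geo_point
  template_name => "test2"
  template_overwrite => true
}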
kmk3173
(Kate)
August 25, 2016, 6:57pm
17
OK, I got it to run with the new index.json template, but I'm still getting the no geo_point error.
my config file:
input {
file {
path => "path to file"
type => "test2"
start_position => "beginning"
ignore_older => 0
sincedb_path => "/dev/null"
}
}
filter {
csv {
columns => [
"timestamp"
"latitude"
"longitude"
"estaddress"
]
separator => ","
}
}
if [latitude] and [longitude] {
mutate {
add_field => [ "[location", "%{longitude}" ]
add_field => [ "[location", "%{latitude}" ]
}
mutate {
convert => [ "[location]", "float" ]
}
}
}
}
}
output {
elasticsearch {
hosts => "localhost:9200"
action => "index"
index => "test2"
workers => 29
manage_template => true
template => "path to template.json"
template_overwrite => "true"
}
stdout {
codec => rubydebug
}
}
My template.json
{
"template" : "test2"
"settings" : {
"index.refresh_interval" : "5s"
},
"mappings" : {
"_default_" : {
"_all_" : { "enabled" : true, "omit_norms" : true },
"properties" : {
"@timestamp" : { "type" : "date", "doc_values" : true},
"location" : {
"type" : "geo_point",
"dynamic" : true,
"properties" : {
"latitude" : { "type" : "float", "doc_values" : true },
"longitude: : {"type" : "float", "doc_values" : true }
}
}
}
}
}
}
Did you delete and recreate the test2 index after getting the index template in place?
kmk3173
(Kate)
August 25, 2016, 11:32pm
19
I don't think so. I re-ran it from the command line, but is there a command-line function to delete the previous index?
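For the record, an index can be deleted with the delete index API, e.g. (assuming Elasticsearch on localhost:9200):
curl -XDELETE 'http://localhost:9200/test2'
After that, the next Logstash run recreates the index, and the template (with location mapped as geo_point) should be applied to it.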