Hello,
I have a CSV file with two geolocation columns:

```
cell_easting cell_northing
26.1541 66.48703
26.161 66.49312
26.166 66.49182
```

I would like to insert them as markers into a map. Here is my config file:
```
input {
  file {
    path => "/home/ahmed/Desktop/tfJuni_mini.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["cell_easting", "cell_northing", "subsperbase", "date_trunc"]
  }
  date { match => [ "date_trunc", "dd.MM.yyyy HH:mm:ss" ] }
  mutate { convert => ["subsperbase", "integer"] }

  if [cell_easting] and [cell_northing] {
    mutate {
      add_field => { "location" => "%{cell_northing}" }
      add_field => { "location" => "%{cell_easting}" }
    }
    mutate { convert => [ "[location]", "geo_point" ] }
  }
}

output {
  elasticsearch {
    hosts => ["http://elastic:changeme@127.0.0.1:9200"]
    index => "tfjuni"
    document_type => "tfJuni"
  }
  stdout {}
}
```
Here is the command line I execute:

```
sudo /usr/share/logstash/bin/logstash --path.settings=/etc/logstash/ -f /home/ahmed/Desktop/TF.conf
```

The problem is that I can't find my index. Is there anything wrong with the config file?
If Logstash has problems sending data to Elasticsearch you'll typically find clues in the logs.
I'm not sure you can list the username and password in the ES URL.
Here is the log from /var/log/logstash/logstash-plain.log:

```
[2017-11-14T20:38:14,592][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:190:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:311:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:408:in `start_pipeline'"]}
[2017-11-14T20:38:14,697][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-14T20:38:17,654][WARN ][logstash.agent ] stopping pipeline {:id=>".monitoring-logstash"}
[2017-11-14T20:38:19,237][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
```
```
mutate { convert => [ "[location]", "geo_point" ] }
```
As documented, geo_point is not a valid type for conversions. Just a few minutes ago I responded to another question related to geo_point. That answer should be useful to you too.
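One common workaround is sketched below (this is an assumption based on the config above, not something stated in the thread): build `location` as a `"lat,lon"` string in Logstash and leave the type to Elasticsearch, which can ingest such a string into a field that the index mapping declares as `geo_point`. Here, `cell_northing` is the latitude and `cell_easting` the longitude.

```
filter {
  if [cell_easting] and [cell_northing] {
    mutate {
      # "lat,lon" string form; requires a geo_point mapping
      # for "location" in the index (e.g. via an index template)
      add_field => { "location" => "%{cell_northing},%{cell_easting}" }
    }
  }
}
```

The mutate filter's `convert` only accepts types such as `integer`, `float`, `string`, and `boolean`, which is why the `geo_point` conversion aborts the pipeline.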
I have changed the config file like this, but I have the same problem:
```
input {
  file {
    path => "/home/ahmed/Desktop/tfJuni_mini.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["cell_easting", "cell_northing", "subsperbase", "date_trunc"]
  }
  date { match => [ "date_trunc", "dd.MM.yyyy HH:mm:ss" ] }
  mutate { convert => ["subsperbase", "integer"] }
  mutate { convert => [ "[cell_easting]", "geo_point" ] }
  mutate { convert => [ "[cell_northing]", "geo_point" ] }
}
output {
  elasticsearch {
    hosts => ["http://elastic:changeme@127.0.0.1:9200"]
    index => "tfjuni"
    document_type => "tfJuni"
  }
  stdout {}
}
```
```
mutate { convert => [ "[cell_easting]", "geo_point" ] }
mutate { convert => [ "[cell_northing]", "geo_point" ] }
```

Again, this won't work. geo_point is not a valid conversion type for the mutate filter.
How can I read these values as a geolocation value? Is it possible with a float type?
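For reference, floats do work in one of the formats geo_point accepts: an array of numbers in `[lon, lat]` order. A sketch along those lines (this is an illustration, not from the thread, and it still assumes an index template mapping `location` as `geo_point`):

```
filter {
  mutate {
    # adding to the same field twice builds an array;
    # geo_point arrays are [lon, lat] — the reverse of the string form
    add_field => { "[location]" => "%{cell_easting}" }
    add_field => { "[location]" => "%{cell_northing}" }
  }
  mutate { convert => { "[location]" => "float" } }
}
```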
So did you read the other post I referred you to?
Yes, I have read it. The problem is converting the CSV columns to a geo_point.
magnusbaeck (Magnus Bäck), November 15, 2017, 12:58pm:
I've continued the old topic (Create geopoint data). You're pretty much asking the same thing, so please follow that topic.
OK, here is my problem. I have added a new field; look at my config file:
```
input {
  file {
    path => "/home/ahmed/Desktop/tfJuni_mini.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["cell_easting", "cell_northing", "subsperbase", "date_trunc"]
  }
  mutate { convert => ["subsperbase", "integer"] }
  mutate {
    add_field => [ "[geoip][location]", "%{cell_easting}" ]
    add_field => [ "[geoip][location]", "%{cell_northing}" ]
  }
  mutate { convert => { "[geoip][location]" => "float" } }
}
output {
  elasticsearch {
    hosts => ["http://elastic:changeme@127.0.0.1:9200"]
    index => "tfjuni"
    document_type => "tfJuni"
  }
  stdout { codec => rubydebug }
}
```
In Kibana the data shows fine, but in the visualization section there is an error.
You're not following the advice I just gave in the other topic. Pay attention to the last paragraph.
Thanks for your reply. I have added a template to map the field:
```
curl -XPUT 'localhost:9200/_template/tfjuni?pretty' -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["tfjuni"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "type1": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "geo.location": {
          "type": "geo_point"
        },
        "created_at": {
          "type": "date",
          "format": "EEE MMM dd HH:mm:ss Z YYYY"
        }
      }
    }
  }
}
'
```
I got this error:

```
{
  "error" : {
    "root_cause" : [
      {
        "type" : "action_request_validation_exception",
        "reason" : "Validation Failed: 1: template is missing;"
      }
    ],
    "type" : "action_request_validation_exception",
    "reason" : "Validation Failed: 1: template is missing;"
  },
  "status" : 400
}
```
I'm really confused by this. The ES version is 5.6.4:

```
"version" : {
  "number" : "5.6.4",
  "build_hash" : "--- ",
  "build_date" : "---",
  "build_snapshot" : false,
  "lucene_version" : "6.6.1"
},
```
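A likely cause (my reading, not stated explicitly in the thread): `index_patterns` is only accepted from Elasticsearch 6.0 onward; on 5.x an index template must use the `template` key instead, and its absence is exactly what the "template is missing" validation error complains about. A minimal 5.6-compatible request might look like:

```
curl -XPUT 'localhost:9200/_template/tfjuni?pretty' -H 'Content-Type: application/json' -d'
{
  "template": "tfjuni",
  "mappings": {
    "type1": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
'
```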
magnusbaeck (Magnus Bäck), November 17, 2017, 10:37am:
OK, now it works; the template is created:
```
PUT _template/tfjuni
{
  "template": "tfjuni",
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "type1": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "location": {
          "type": "geo_point"
        },
        "created_at": {
          "type": "date",
          "format": "EEE MMM dd HH:mm:ss Z YYYY"
        }
      }
    }
  }
}
```
Again, the location field doesn't change to the geo_point type.
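One likely reason, worth noting here: index templates are only applied when an index is created, so an existing `tfjuni` index keeps whatever mapping it was created with. Deleting the index and letting Logstash recreate it should pick up the template:

```
# WARNING: this deletes the index and all its data
curl -XDELETE 'localhost:9200/tfjuni?pretty'
# then re-run the Logstash pipeline so the index is recreated
```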
magnusbaeck (Magnus Bäck), November 17, 2017, 12:05pm:
What do the mappings of a newly created index look like?
I have a field with lon/lat coordinates. I need to map this field as geo_point in ES; as you said in the other post, it is possible using an index template.
Please answer my question. What do the mappings of a newly created index look like? Use Elasticsearch's get mapping API. Please also show an example document. Copy/paste from the JSON tab in Kibana.
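For reference, the get mapping request for the index used in this thread would be:

```
curl -XGET 'localhost:9200/tfjuni/_mapping?pretty'
```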