I have configured a Logstash pipeline that uses the logstash-filter-geoip filter to look up geographical information for an IP address. I specified a mapping before indexing documents into Elasticsearch, but Logstash shows the error below while indexing:
[WARN ] 2021-03-24 16:35:41.900 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test1", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0xc7c54f6>], :response=>{"index"=>{"_index"=>"test1", "_type"=>"_doc", "_id"=>"bwTpY3gB09SioyeoXtLa", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [myloc] of type [geo_point]", "caused_by"=>{"type"=>"parse_exception", "reason"=>"field must be either [lat], [lon] or [geohash]"}}}}}
This is the mapping I created before indexing the documents:
PUT /test1
{
  "mappings": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}
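To double-check what Elasticsearch actually stored for the index, the mapping can be retrieved back; the host and the credential placeholders here are assumed to match the pipeline configuration further down:

```shell
# Retrieve the mapping Elasticsearch currently holds for the test1 index;
# host and credentials are placeholders matching the pipeline config.
curl -u "${USERNAME}:${PASSWD}" -X GET "http://localhost:9200/test1/_mapping?pretty"
```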
This is the event produced by Logstash that contains the geographical information:
{
    "myloc" => {
        "latitude" => 47.6348,
        "region_code" => "WA",
        "timezone" => "America/Los_Angeles",
        "ip" => "3.7.23.139",
        "country_code2" => "US",
        "country_code3" => "US",
        "longitude" => -122.3451,
        "dma_code" => 819,
        "region_name" => "Washington",
        "country_name" => "United States",
        "location" => {
            "lon" => -122.3451,
            "lat" => 47.6348
        },
        "city_name" => "Seattle",
        "continent_code" => "NA",
        "postal_code" => "98109"
    },
    "@version" => "1",
    "@timestamp" => 2021-03-24T11:05:40.821Z,
    "host" => "0.0.0.0",
    "message" => "3.7.23.139"
}
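To check whether the problem is specific to Logstash, the same document can be indexed by hand. This is only a sketch: the host and credential placeholders are assumed to match the pipeline below, and the body mirrors the geoip output shown above:

```shell
# Index the geoip output manually to see whether Elasticsearch
# rejects the document outside of Logstash as well.
curl -u "${USERNAME}:${PASSWD}" -X POST "http://localhost:9200/test1/_doc" \
  -H 'Content-Type: application/json' \
  -d '{
        "myloc": {
          "latitude": 47.6348,
          "longitude": -122.3451,
          "location": { "lon": -122.3451, "lat": 47.6348 }
        }
      }'
```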
This is my Logstash pipeline configuration:
input {
  stdin {}
}

filter {
  geoip {
    source => "message"
    target => "myloc"
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => ["http://localhost:9200"]
    user => "${USERNAME}"
    password => "${PASSWD}"
    index => "test1"
  }
}
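For completeness, since the pipeline reads from stdin, a typical invocation looks like the following (the config file path is an assumption):

```shell
# Pipe a single IP address into the stdin input of the pipeline;
# pipeline.conf is an assumed name for the config file shown above.
echo "3.7.23.139" | bin/logstash -f pipeline.conf
```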
How can I fix this error?