I realize I should have a better understanding of Elasticsearch, indexes, mappings, Logstash, and Beats before coming here asking for help, but I really need to put this behind me. I have spent days searching and reading about the geoip filter, the geoip-info pipeline, and creating private databases for lookups, and clearly I don't have the knowledge to make it happen. I don't need a complex solution; I need a very simple, one-line solution (if possible!).
I have a simple Elasticsearch setup. Following the small business blog post, I configured Packetbeat to include private GeoIP data for source and destination addresses. The configuration is simple, easy to understand, and works:
- add_fields:
    when.network.destination.ip: private
    fields:
      destination.geo.location:
        lat: 40.7128
        lon: -74.0060
      destination.geo.continent_name: North America
      destination.geo.country_iso_code: US
      destination.geo.region_name: New York
      destination.geo.region_iso_code: US-NY
      destination.geo.city_name: New York City
    target: ''
I would like to take that same approach in Logstash, using a filter, mutate, or anything else that would work, to add the private GeoIP data for source and destination. Could someone please show me how to complete the config file so that it does the same as the above? Here is my Logstash config.
Obviously, once I can get Logstash to correctly modify the source and destination information, I can remove all of these settings at the Beat level. I hope someone can complete the configuration above; I can't imagine there isn't a way to populate the same data that the Beat does. Any help you can give would be greatly appreciated.
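To make it concrete, this is roughly the shape I imagine, though I have not gotten it working. The cidr filter and the field references are my guesses at a Logstash equivalent of the when.network.destination.ip: private condition and the add_fields processor above, and I don't know how to handle the geo_point field destination.geo.location at all:

filter {
  # Guess at the equivalent of when.network.destination.ip: private:
  # tag events whose destination IP falls in a private range.
  cidr {
    address => [ "%{[destination][ip]}" ]
    network => [ "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16" ]
    add_tag => [ "destination_private" ]
  }

  if "destination_private" in [tags] {
    # Add the same private GeoIP metadata the Beat processor adds.
    mutate {
      add_field => {
        "[destination][geo][continent_name]"   => "North America"
        "[destination][geo][country_iso_code]" => "US"
        "[destination][geo][region_name]"      => "New York"
        "[destination][geo][region_iso_code]"  => "US-NY"
        "[destination][geo][city_name]"        => "New York City"
      }
    }
    # destination.geo.location (lat/lon) is the part I can't figure out.
  }
}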
Thank you for your response. I tried what you suggested, which yields the following error in the Logstash logs:
{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse field [destination.geo.location] of type [geo_point]",
"caused_by"=>{"type"=>"parse_exception", "reason"=>"longitude must be a number"}}}}}
Really appreciate your response. I added that as well, but I get the same result:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [destination.geo.location] of type [geo_point]", "caused_by"=>{"type"=>"parse_exception", "reason"=>"longitude must be a number"}}}}}
Thanks again. If I remove all of the filter statements at the Logstash level and add them to each individual Beat configuration (as I posted originally), everything works as it should. I'm just trying to avoid updating hundreds of Beat configs on multiple machines on a private network.
Genius. Thank you so much for helping me through that. I've made the changes and it's working without error. Hopefully it will yield the results I wanted from the beginning. Again, I thank you!
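For anyone else who runs into the same geo_point error: the likely culprit is that mutate's add_field always adds string values, and a field mapped as geo_point (destination.geo.location here) rejects a string longitude, which is what the error above complains about. A sketch of the kind of change that addresses it, converting the coordinates to float in a second mutate block so the convert runs after the fields exist (the destination_private tag is carried over from my earlier sketch):

filter {
  if "destination_private" in [tags] {
    mutate {
      add_field => {
        "[destination][geo][location][lat]" => "40.7128"
        "[destination][geo][location][lon]" => "-74.0060"
      }
    }
    # add_field always creates strings, so convert the coordinates to
    # numbers in a separate mutate block that runs after the fields exist.
    mutate {
      convert => {
        "[destination][geo][location][lat]" => "float"
        "[destination][geo][location][lon]" => "float"
      }
    }
  }
}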