Hi Elastic Support Manager,
Please help me figure out how to fix the issues below.
I use Filebeat to send logs to Logstash -> Elasticsearch -> Kibana (all on one host).
Kibana version: 5.4.2
Logstash version: 5.5.1
Filebeat version: 5.5.1
Issues:
(1) Tile Map - fails to show locations on the map.
(2) After setting (database => "/etc/logstash/GeoLiteCity.dat") in the Logstash config file below (10-django-filter.conf), Logstash fails to flush data to Elasticsearch.
Logstash worked well without database => "/etc/logstash/GeoLiteCity.dat".
GeoLiteCity.dat was downloaded from http://dev.maxmind.com/geoip/geoip2/geolite2/
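Regarding issue (2), one detail worth checking: the geoip filter bundled with Logstash 5.x reads only MaxMind's newer MMDB format (the GeoLite2 downloads), while a file named GeoLiteCity.dat is typically the legacy binary format, which would match the "invalid or corrupted" error in the log below. A minimal Python sketch to see which format a given file is (the path is taken from the config; the marker check follows the MMDB file-format spec, whose metadata section starts with the bytes `\xab\xcd\xefMaxMind.com` near the end of the file):

```python
# Check whether a MaxMind database file is in the newer MMDB (GeoLite2)
# format, which the Logstash 5.x geoip filter requires.
MMDB_MARKER = b"\xab\xcd\xefMaxMind.com"  # metadata marker per the MMDB spec


def is_mmdb(path):
    """Return True if the file contains the MMDB metadata marker near its end."""
    with open(path, "rb") as f:
        f.seek(0, 2)                       # jump to end of file
        size = f.tell()
        f.seek(max(0, size - 128 * 1024))  # marker sits in the trailing metadata
        return MMDB_MARKER in f.read()


# Example (path taken from the Logstash config above):
# is_mmdb("/etc/logstash/GeoLiteCity.dat")  # a legacy .dat file gives False
```

If this returns False for the file, replacing it with the GeoLite2-City .mmdb download should be the fix to try.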
=============================================================================
[Logstash filter]
filter {
if [type] == "log" {
grok {
match => { "message" => "%{TIMESTAMP_ISO8601:logDate} \[%{DATA:module}:%{NUMBER:line_no}\] \[%{WORD:django_module}:%{WORD:function_name}\] \[%{LOGLEVEL:log_level}\]-%{GREEDYDATA:uri_path}" }
overwrite => ["message"]
}
}
geoip {
source => "clientip"
target => "geoip"
database => "/etc/logstash/GeoLiteCity.dat"
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
convert => [ "[geoip][coordinates]", "float"]
merge => {"[tags]" => "[fields][tags]"}
remove_field => "[fields][tags]"
}
date {
match => ["logDate", "yyyy-MM-dd HH:mm:ss,SSS"]
timezone => "Asia/Taipei"
target => "@timestamp"
}
}
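As a sanity check that the grok pattern matches the sample line from the Input Data section, here is a rough Python translation of it (the regex is my approximation, not the exact expression grok compiles to; the literal brackets must be escaped, as in the pattern above):

```python
import re

# Rough regex equivalent of the grok pattern in the filter above.
PATTERN = re.compile(
    r"(?P<logDate>\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2},\d{3}) "
    r"\[(?P<module>[^:\]]+):(?P<line_no>\d+)\] "
    r"\[(?P<django_module>\w+):(?P<function_name>\w+)\] "
    r"\[(?P<log_level>\w+)\]-(?P<uri_path>.*)"
)

# Sample message copied from the Input Data section below.
sample = ("2017-09-01 18:56:44,597 [django.server:125] "
          "[basehttp:log_message] [INFO]- clientip: 213.171.45.10, "
          "path: http://1X.1XX.3X.5X:80/phpmanager/")

m = PATTERN.match(sample)
```

Note that with this pattern uri_path captures everything after "]-", including the "clientip: ..." prefix, so the geoip source field has to be extracted separately (as the clientip field in the event data suggests it is).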
=============================================================================
[filebeat.yml]
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/django/all.log*
  document_type: django
  scan_frequency: 5s

geoip:
  paths:
    - "/usr/share/GeoIP/GeoLiteCity.dat"
=============================================================================
[Input Data]
[2017-09-01T10:56:46,155][DEBUG][logstash.pipeline ] output received {"event"=>{"geoip"=>{"ip"=>"213.171.45.10", "latitude"=>55.7386, "country_name"=>"Russia", "country_code2"=>"RU", "continent_code"=>"EU", "country_code3"=>"RU", "location"=>{"lon"=>37.6068, "lat"=>55.7386}, "longitude"=>37.6068}, "offset"=>240829, "logDate"=>"2017-09-01 18:56:44,597", "module"=>"django.server", "line_no"=>"125", "input_type"=>"log", "log_level"=>"INFO", "django_module"=>"basehttp", "source"=>"/var/log/django/all.log", "message"=>"2017-09-01 18:56:44,597 [django.server:125] [basehttp:log_message] [INFO]- clientip: 213.171.45.10, path: http://1X.1XX.3X.5X:80/phpmanager/", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"], "@timestamp"=>2017-09-01T10:56:44.597Z, "uri_path"=>"http://1X.1XX.3X.5X:80/phpmanager/", "function_name"=>"log_message", "clientip"=>"213.171.45.10", "@version"=>"1", "beat"=>{"hostname"=>"p.com", "name"=>"p.com", "version"=>"5.5.1"}, "host"=>"p.com"}}
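The date filter's conversion can be verified independently from the event above: logDate 2017-09-01 18:56:44,597 in Asia/Taipei (a fixed UTC+8 zone, no DST) should map to @timestamp 2017-09-01T10:56:44.597Z, which is exactly what the event shows. A small sketch of that arithmetic:

```python
from datetime import datetime, timedelta, timezone

# Asia/Taipei has a fixed +08:00 offset with no daylight saving time.
TAIPEI = timezone(timedelta(hours=8))

# logDate value copied from the event above.
log_date = "2017-09-01 18:56:44,597"
local = datetime.strptime(log_date, "%Y-%m-%d %H:%M:%S,%f").replace(tzinfo=TAIPEI)
utc = local.astimezone(timezone.utc)
# utc now corresponds to the @timestamp value 2017-09-01T10:56:44.597Z
```

So the date filter itself looks correct; the 8-hour gap between logDate and @timestamp is expected, not a bug.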
=========================================================================================================================================
[Error Log]
[2017-09-01T11:05:05,195][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/GeoLiteCity.dat"}
[2017-09-01T11:05:05,271][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x5902246b @id="e415b8ff6331cf6d84009f6d10dde0e6c61705ed-3", @klass=LogStash::Filters::GeoIP, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x254836bc @metric=#<LogStash::Instrument::Metric:0x7cbdbef @collector=#<LogStash::Instrument::Collector:0x4da4c42f @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x5cd908c @store=#<Concurrent:0x000000000626e0 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x6154bb5f, @fast_lookup=#<Concurrent:0x000000000626e4 entries=65 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"e415b8ff6331cf6d84009f6d10dde0e6c61705ed-3", :events]>, @logger=#<LogStash::Logging::Logger:0x53772eb9 @logger=#Java::OrgApacheLoggingLog4jCore::Logger:0x4660248>, @filter=<LogStash::Filters::GeoIP source=>"clientip", target=>"geoip", database=>"/etc/logstash/GeoLiteCity.dat", add_field=>{"[geoip][coordinates]"=>["%{[geoip][longitude]}", "%{[geoip][latitude]}"]}, id=>"e415b8ff6331cf6d84009f6d10dde0e6c61705ed-3", enable_metric=>true, periodic_flush=>false, cache_size=>1000, lru_cache_size=>1000, tag_on_failure=>["_geoip_lookup_failure"]>>", :error=>"The database provided is invalid or corrupted."}
[2017-09-01T11:05:05,271][DEBUG][logstash.filters.grok ] closing {:plugin=>"LogStash::Filters::Grok"}
[2017-09-01T11:05:05,309][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>java.lang.IllegalArgumentException: The database provided is invalid or corrupted., :backtrace=>["org.logstash.filters.GeoIPFilter.<init>(org/logstash/filters/GeoIPFilter.java:67)", "java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:423)", "RUBY.register(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.2.1-java/lib/logstash/filters/geoip.rb:116)", "RUBY.register(/usr/share/logstash/vendor/jruby/lib/ruby/1.9/forwardable.rb:201)", "RUBY.register_plugin(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:281)", "RUBY.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:292)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:292)", "RUBY.start_workers(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:302)", "RUBY.run(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:226)", "RUBY.start_pipeline(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:398)", "java.lang.Thread.run(java/lang/Thread.java:748)"]}
Thanks,
Ashley