Using Logstash to process syslog

I read the article https://anderikistan.com/2016/03/26/elk-palo-alto-networks/ and I am now trying to set up Logstash to process Palo Alto logs. However, I am running into an issue. I am using a plain Ubuntu 16.04.2 LTS install with the latest versions of Elasticsearch, Kibana, and Logstash (5.4.1). I am using the pan-traffic.conf and elasticsearch-template.json files described in the article. Here they are:
https://pastebin.com/raw/1iaF3yvv for elasticsearch-template.json
https://pastebin.com/raw/9gwTR5TP for pan-traffic.conf

I had to get a newer version of the GeoLite database since the format changed; I got it here: http://dev.maxmind.com/geoip/geoip2/geolite2/ Otherwise, I followed the instructions exactly as described in the article.

I get the following errors as soon as I start Logstash:

[2017-06-07T18:03:45,225][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://localhost:9200/_template/logstash'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>[
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:269:in `perform_request_to_url'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:257:in `perform_request'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:347:in `with_connection'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:256:in `perform_request'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:264:in `put'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client.rb:325:in `template_put'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/http_client.rb:82:in `template_install'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/template_manager.rb:29:in `install'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/template_manager.rb:9:in `install_template'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/common.rb:62:in `install_template'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.4-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `register'", 
"/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'", 
"/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:41:in `register'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:268:in `register_plugin'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", 
"org/jruby/RubyArray.java:1613:in `each'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `start_workers'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:214:in `run'", 
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}

[2017-06-07T18:04:23,655][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"pan-traffic", :_type=>"syslog", :_routing=>nil}, 2017-06-07T23:04:23.000Z ussyslog01 %{message}], :response=>{"index"=>{"_index"=>"pan-traffic", "_type"=>"syslog", "_id"=>"AVyEllTfUFwpLY8k9q0R", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [DestinationGeo.location]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Current token (START_OBJECT) not numeric, can not use numeric value accessors\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@467f0099; line: 1, column: 906]"}}}}}

Any ideas on what could be going wrong?
Thank you.

What if you try to post the index template to ES by hand, e.g. with curl? If nothing else you might get a better error message.

What's the command to do that?

curl -XPOST http://localhost:9200/_template/logstash -d @path-to-your-template-file.json

See https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html for details.
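Something like the following should work (the path to the template file is just an example; note the @ before the file path, which tells curl to read the request body from the file instead of sending the path string itself):

```shell
# Post the index template by hand. Setting the Content-Type header
# explicitly avoids a 406 from Elasticsearch versions that enforce
# strict content-type checking.
curl -XPOST http://localhost:9200/_template/logstash \
     -H 'Content-Type: application/json' \
     -d @/opt/logstash/elasticsearch-template.json

# Verify the template was stored
curl -XGET 'http://localhost:9200/_template/logstash?pretty'
```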

root@ussyslog01:/usr/share/logstash/bin# curl -XPOST http://localhost:9200/_template/logstash -d /opt/logstash/elasticsearch-template.json

{"error":"Content-Type header [application/x-www-form-urlencoded] is not supported","status":406}

The curl command gives the error above. Any ideas?

Your command is missing a "@".

Ah, I see, sorry about that.

Here's the output:

root@ussyslog01:/var/log# curl -XPOST http://localhost:9200/_template/logstash -d @/opt/logstash/elasticsearch-template.json

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Mapping definition for [location] has unsupported parameters:  [geohash : true] [lat_lon : true]"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping [_default_]: Mapping definition for [location] has unsupported parameters:  [geohash : true] [lat_lon : true]","caused_by":{"type":"mapper_parsing_exception","reason":"Mapping definition for [location] has unsupported parameters:  [geohash : true] [lat_lon : true]"}},"status":400}

Perhaps the geohash parameter isn't supported in ES 5?

Any ideas on how I can fix it to work with ES5?

@magnusbaeck You were right, lat_lon and geohash properties are no longer supported in ES 5.
I found a post describing the exact same error I am having with the same JSON file:
same issue with Palo Alto logs processing on StackOverflow

I followed the advice there and now the json file looks like this:
https://pastebin.com/Qh1S4j0b
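For reference, the fix amounts to removing the unsupported lat_lon and geohash parameters, so each geo field mapping reduces to something like this (a sketch; see the pastebin link for the full file):

```json
"DestinationGeo": {
  "type": "object",
  "dynamic": true,
  "properties": {
    "location": { "type": "geo_point" }
  }
}
```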

Now loading the template works:

root@ussyslog01:/var/log# curl -XPOST http://localhost:9200/_template/logstash -d @/opt/logstash/elasticsearch-template.json
{"acknowledged":true}

However, I am now running into another issue after Logstash starts up:

[2017-06-09T12:19:46,672][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-06-09T12:19:46,676][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-06-09T12:19:46,769][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x4d6f65b3 URL:http://localhost:9200/>}
[2017-06-09T12:19:46,770][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>"/opt/logstash/elasticsearch-template.json"}
[2017-06-09T12:19:46,837][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}, "SourceGeo"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}, "DestinationGeo"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}}
[2017-06-09T12:19:46,840][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2017-06-09T12:19:46,896][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x2808fede URL://localhost:9200>]}
[2017-06-09T12:19:46,947][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/opt/logstash/GeoLiteCity.dat"}
[2017-06-09T12:19:46,971][INFO ][logstash.filters.geoip   ] Using geoip database {:path=>"/opt/logstash/GeoLiteCity.dat"}
[2017-06-09T12:19:46,974][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
[2017-06-09T12:19:47,131][INFO ][logstash.pipeline        ] Pipeline main started
[2017-06-09T12:19:47,173][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-06-09T12:21:07,449][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"pan-traffic", :_type=>"syslog", :_routing=>nil}, 2017-06-09T17:21:06.000Z ussyslog01 %{message}], :response=>{"index"=>{"_index"=>"pan-traffic", "_type"=>"syslog", "_id"=>"AVyNqMbtUFwpLY8kFUyz", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [DestinationGeo.location]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Current token (START_OBJECT) not numeric, can not use numeric value accessors\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@1cf4a580; line: 1, column: 933]"}}}}}

The template loads, but indexing still fails because something else is broken. Can you see where I made an error?

What does the DestinationGeo.location field look like in that event? Use a stdout { codec => rubydebug } output to inspect it.
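For example, you can temporarily add a stdout output alongside the existing elasticsearch output (a sketch of the output section; adapt it to your pan-traffic.conf):

```
output {
  elasticsearch { ... }           # keep your existing elasticsearch output
  stdout { codec => rubydebug }   # print every event to stdout for inspection
}
```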

What's the mapping of that field? Use the ES get mapping API to check.
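For example (worth noting: the template shown in your startup log applies to indices matching logstash-*, so it may be worth checking whether the pan-traffic index actually picked up the geo_point mapping):

```shell
# Show the mapping of the pan-traffic index; if DestinationGeo.location
# is not geo_point here, the template never applied to this index
curl -XGET 'http://localhost:9200/pan-traffic/_mapping?pretty'
```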

There is an open issue asking for Logstash to better surface the mapping error that Elasticsearch returns; you might want to consider upvoting it: https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/333#issuecomment-282274531

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.