Logstash not processing new beat


I am upgrading my logging, so I have reinstalled the server with a new ELK stack and rebuilt a system (Bro & Suricata) to send logs into it.

Now in Kibana I can see the firewall syslog logs from before, and a sensor (due to be rebuilt) running Filebeat 5.x still has its logs appearing. The new sensor running a recent 6.x version shows nothing, though, even while traffic is reaching the box. I do see the following in the Logstash logs, but I'm not sure it's related, and every time I try the mutate fix I've found for it, it breaks all my logging.

[2019-01-31T13:24:58,858][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.01.31", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x415a661a], :response=>{"index"=>{"_index"=>"logstash-2019.01.31", "_type"=>"doc", "_id"=>"5LwVpGgBZZmAyljHf3pn", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:60"}}}}}
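The `mapper_parsing_exception` on `[host]` typically means the index mapping expects `host` to be a plain string (as older Beats and syslog sources send it), while Beats 6.3+ sends `host` as an object containing fields like `host.name`. A mutate fix applied to all events would mangle the sources that already send `host` as a string, which may be why it broke everything. A minimal sketch of a conditional fix, assuming the 6.x events carry `[host][name]` (adjust the field to whatever your events actually contain):

```
filter {
  # Only touch events where the Beat sent `host` as an object;
  # string-valued `host` fields from syslog/5.x sources pass through untouched.
  if [host][name] {
    mutate {
      # Collapse the object down to the hostname string so it matches
      # the existing text mapping in the logstash-* index.
      rename => { "[host][name]" => "[host]" }
    }
  }
}
```

This keeps the fix scoped to the new sensor's events instead of rewriting `host` on every input.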

I am wondering how to go about tracing this issue down, as I have checked all the common things (error messages, network connectivity, etc.). I think the problem is now in the Logstash-to-Elasticsearch leg, but only that one error is occurring, and it could be coming from another system such as the firewalls.

Is there an error in the Elasticsearch logs at that time?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.