Hi,
I receive the following error in Kibana:
"[esaggs] > Saved "field" parameter is now invalid. Please select a new field."
In the Logstash log I also see:
[2020-01-17T06:10:41,665][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"winlogbeat-2020.01.14", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x71629e42], :response=>{"index"=>{"_index"=>"winlogbeat-2020.01.14", "_type"=>"_doc", "_id"=>"ar3psW8BXwWkOg89GOdx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [winlog.event_data.param1] of type [date] in document with id 'ar3psW8BXwWkOg89GOdx'. Preview of field's value: 'Netzwerkeinrichtungsdienst'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Netzwerkeinrichtungsdienst] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
[2020-01-17T06:10:41,666][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"winlogbeat-2020.01.14", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x2c296414], :response=>{"index"=>{"_index"=>"winlogbeat-2020.01.14", "_type"=>"_doc", "_id"=>"a73psW8BXwWkOg89GOdx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [winlog.event_data.param1] of type [date] in document with id 'a73psW8BXwWkOg89GOdx'. Preview of field's value: 'Netzwerkeinrichtungsdienst'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Netzwerkeinrichtungsdienst] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
I have a Windows Server 2019 machine from which I want to ship all logs with Winlogbeat to Logstash and then on into Elasticsearch.
Everything is on version 7.5.1 (the whole ELK stack and Winlogbeat).
The problem is probably in my Logstash output configuration:
output {
  if [@metadata][beat] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      manage_template => true
      index => "%{[@packetbeat][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "pf-%{+YYYY.MM.dd}"
    }
  }
}
Can anyone help? I have replaced index => "%{[@packetbeat][beat]}-%{+YYYY.MM.dd}" with different setups, but it still doesn't work.
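For example, one of the setups I tried was along these lines, using the index pattern the Logstash docs suggest for Beats input and dropping document_type in that attempt (the conditional and hosts are unchanged from above):

output {
  if [@metadata][beat] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      manage_template => true
      # build the index name from the Beat's own metadata, e.g. winlogbeat-7.5.1-2020.01.17
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      # document_type omitted here, since mapping types are deprecated in Elasticsearch 7.x
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "pf-%{+YYYY.MM.dd}"
    }
  }
}

Even with variations like this, the events still don't end up in Elasticsearch correctly.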