Logstash Output wrong - Error in visualization [esaggs] > "field" is a required parameter

Hi,

I receive the following error in Kibana:

"[esaggs] > Saved "field" parameter is now invalid. Please select a new field."

In the Logstash log I also see:

[2020-01-17T06:10:41,665][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"winlogbeat-2020.01.14", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x71629e42], :response=>{"index"=>{"_index"=>"winlogbeat-2020.01.14", "_type"=>"_doc", "_id"=>"ar3psW8BXwWkOg89GOdx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [winlog.event_data.param1] of type [date] in document with id 'ar3psW8BXwWkOg89GOdx'. Preview of field's value: 'Netzwerkeinrichtungsdienst'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Netzwerkeinrichtungsdienst] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
[2020-01-17T06:10:41,666][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"winlogbeat-2020.01.14", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x2c296414], :response=>{"index"=>{"_index"=>"winlogbeat-2020.01.14", "_type"=>"_doc", "_id"=>"a73psW8BXwWkOg89GOdx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [winlog.event_data.param1] of type [date] in document with id 'a73psW8BXwWkOg89GOdx'. Preview of field's value: 'Netzwerkeinrichtungsdienst'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Netzwerkeinrichtungsdienst] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
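A note on the two WARN lines above: a mapper_parsing_exception like this usually means that winlog.event_data.param1 was dynamically mapped as a date in the winlogbeat-2020.01.14 index (presumably because an earlier event's param1 happened to look like a date), so later string values such as 'Netzwerkeinrichtungsdienst' are rejected. One possible workaround, sketched as a Logstash filter and assuming you would rather keep the value under a differently named string field than drop the event (param1_text is a made-up name, not anything Winlogbeat produces):

```
filter {
  if [winlog][event_data][param1] {
    mutate {
      # Hypothetical workaround: move the value to a field that has not
      # been mapped as a date, so mixed-type values no longer fail to index.
      rename => { "[winlog][event_data][param1]" => "[winlog][event_data][param1_text]" }
    }
  }
}
```

This only sidesteps the symptom; the cleaner fix is to index into an index whose mapping comes from the official Winlogbeat template rather than from dynamic mapping.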

I have a Windows Server 2019 machine from which I want to ship all logs with Winlogbeat to Logstash and then into Elasticsearch.

Everything is on v7.5.1 (the whole ELK stack and Winlogbeat).

The problem is probably in the Logstash output:

output {
  if [@metadata][beat] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      manage_template => true
      index => "%{[@packetbeat][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "pf-%{+YYYY.MM.dd}"
    }
  }
}

Can anyone help? I have replaced index => "%{[@packetbeat][beat]}-%{+YYYY.MM.dd}" with different setups, but it doesn't work.
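One detail worth flagging (an observation, not a confirmed fix): since Winlogbeat is the shipper here, its events carry no [@packetbeat][beat] field, so that sprintf reference cannot resolve; Logstash leaves an unresolved %{...} reference as literal text in the index name. The Beats-provided metadata field is usually referenced like this:

```
index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
```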

It looks like you have a problem with date parsing. Can you provide your filter configuration?

Which filter do you mean?

# 01-inputs.conf
input {
  udp {
    port => 5140
  }
  beats {
    port => 5044
  }
}

filter {
  # Adjust to match the IP address of pfSense or OPNsense
  if [host] =~ /192\.XXXX\.XXX\.XXX/ {
    mutate {
      add_tag => ["pf", "Ready"]
    }
  }
  # To ingest multiple pfSense or OPNsense instances, uncomment the section below
  ##############################
  #if [host] =~ /172\.2\.22\.1/ {
  #  mutate {
  #    add_tag => ["pf-2", "Ready"]
  #  }
  #}
  ##############################
  if "pf" in [tags] {
    grok {
      # OPNsense - enable/disable the line below based on firewall platform
      #match => { "message" => "%{SYSLOGTIMESTAMP:pf_timestamp} %{SYSLOGHOST:pf_hostname} %{DATA:pf_program}(?:\[%{POSINT:pf_pid}\])?: %{GREEDYDATA:pf_message}" }
      # pfSense - enable/disable the line below based on firewall platform
      match => { "message" => "%{SYSLOGTIMESTAMP:pf_timestamp} %{DATA:pf_program}(?:\[%{POSINT:pf_pid}\])?: %{GREEDYDATA:pf_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    mutate {
      rename => { "[message]" => "[event][original]" }
      remove_tag => "Ready"
    }
  }
}

I don't have any other filter concerning Winlogbeat as far as I know. The filters here only apply to pfSense.
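On the date-parsing question: the grok above extracts pf_timestamp but nothing ever parses it, so @timestamp for those events stays at ingest time. This is unrelated to the Winlogbeat mapping error, but if the event time should come from the syslog timestamp, a date filter could be added. A sketch, assuming pf_timestamp is in the standard syslog format (e.g. "Jan 17 06:10:41"):

```
filter {
  if "pf" in [tags] {
    date {
      # Parse the syslog timestamp captured by grok into @timestamp.
      # Two patterns cover single- and double-digit days; syslog
      # timestamps carry no year, so the current year is assumed.
      match => [ "pf_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
```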

I have now deleted everything (index, saved objects, everything) and started Logstash again. The winlogbeat index pattern is created automatically, and so is the dashboard, but I still get the errors. It's because @timestamp is not correctly added and stored in the visualizations.
Maybe a bug, or I'm too stupid ;).

Solution: I added @timestamp to the affected visualizations and saved them. The dashboard now works, and I no longer see errors in Logstash.

Output in Logstash:

output {
  if [@metadata][beat] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      manage_template => true
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "pf-%{+YYYY.MM.dd}"
    }
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.