Hi there:)
Here's my scenario: we grab Windows events via Winlogbeat and, for backup reasons, write them to a RAID 1 NAS first. Unfortunately we can't let Winlogbeat write directly to Elasticsearch.
First Logstash cfg:
input {
  beats {
    port => "5044"
  }
}

output {
  file {
    path => "Z://ea/ea_log-%{+YYYY-MM-dd}.json"
  }
}
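(As far as I understand, the file output defaults to the json_lines codec, so every event should end up as one JSON object per line. Making that explicit would look roughly like this, though I'm not sure it changes anything:)

  output {
    file {
      path => "Z://ea/ea_log-%{+YYYY-MM-dd}.json"
      # json_lines should already be the default; stated here only to be explicit
      codec => "json_lines"
    }
  }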
After that, a second machine running Logstash picks the files up and tries to write them into Elasticsearch.
Second Logstash cfg:
input {
  file {
    path => "Z:/ea/ea_log-*.json"
    ignore_older => 8640000
    start_position => "beginning"
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ea-%{+YYYY.MM.dd}"
  }
}
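(Side note: I'm not sure whether the file input's sincedb matters here on Windows. If it does, I guess pinning it down would look something like this; the sincedb_path below is just a placeholder:)

  input {
    file {
      path => "Z:/ea/ea_log-*.json"
      ignore_older => 8640000
      start_position => "beginning"
      codec => "json"
      # placeholder location, just to make the read position explicit
      sincedb_path => "C:/logstash/sincedb_ea"
    }
  }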
So far everything works fine and Kibana sees a lot of entries.
BUT when it comes to parsing the "message" field we get a _jsonparsefailure.
I know we brought this on ourselves by setting the codec to JSON while the "message" field is plain text coming from the Windows events, but there must be a way to get the fields into Elasticsearch correctly.
I already tried applying the winlogbeat_template to my ea index, but without success.
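(For reference, here is roughly how I imagine the template wiring would look on the second Logstash; the template path is just a placeholder, and I suppose I'd also have to change the template's index pattern to ea-* before it applies:)

  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "ea-%{+YYYY.MM.dd}"
      # placeholder path to an exported winlogbeat template file
      template => "C:/logstash/winlogbeat.template.json"
      template_name => "ea"
      template_overwrite => true
    }
  }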
I also played around with:

  json {
    source => "message"
  }

in the Logstash cfg, but I'm just not getting this parsed into Elasticsearch.
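(The only other idea I had is to run the json filter only when "message" actually looks like JSON, so the plain-text event messages don't produce a _jsonparsefailure; "parsed_message" is just a name I made up:)

  filter {
    # only try to parse "message" as JSON when it starts with a brace
    if [message] =~ /^\s*\{/ {
      json {
        source => "message"
        # made-up target field so a successful parse doesn't overwrite existing fields
        target => "parsed_message"
        tag_on_failure => ["_jsonparsefailure_message"]
      }
    }
  }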
Can someone help me out?
Thanks in advance, and excuse my English.