I have the following pipeline config:
input {
  file {
    type => "json"
    path => [ "C:/temp/*.json" ]
    start_position => "beginning"
    codec => multiline {
      pattern => "^ZsExDrC"
      what => "previous"
      negate => true
      auto_flush_interval => 2
      max_lines => 50000
    }
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => [ "https://hotdata1:9200", "https://hotdata2:9200" ]
    index => "index-name-%{+YYYY.MM.dd}"
    user => "logstash_internal"
    password => "xxxxxx"
    ssl => true
    cacert => "path/to/cert"
  }
}
It's ingesting the log into ES, but the fields aren't being parsed and I'm getting the following tags: multiline and _jsonparsefailure. I looked in the Logstash logs (set to debug), but can't find any more details on what is causing the parse failure. I'm sorry I can't easily share the JSON here, but am wondering if there is anything in my pipeline config that is wrong?
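In case it helps, this is the temporary debug output I was planning to add alongside (or in place of) the elasticsearch output, so I can see exactly what the json filter receives once the multiline codec has assembled an event; stdout with the rubydebug codec just dumps each event, including the raw message field and any tags, to the console:

output {
  # print each event (assembled [message], tags, etc.) to the console
  # so the JSON that fails to parse can be inspected directly
  stdout { codec => rubydebug }
}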