[on Windows] I have a log file that can contain multi-line messages.
The Filebeat yml is correctly configured for multiline.
The source that appears in Elastic contains literal "\n" sequences.
e.g. source file content "
ababasd
asfjklasjdflkjas
asljdflkajs]"
will appear in the source as
"ababasd\nasfjklasjdflkjas\nasljdflkajs]"
and in kibana would appear as
"ababasd"
The ingest pipeline parses the source with a greedy pattern,
e.g.
"%{GREEDYDATA:msg}"
The msg field then appears in Kibana truncated at the first "\n", rather than as the full message with newlines.
The exact same yml was used previously with a very early version of Filebeat (1.1) and ELK (i.e. Logstash), and it worked fine.
It would appear that Filebeat is encoding the newlines in the log file as "\n" for JSON,
but then nothing is decoding them when the Elastic ingest pipeline parses the event.
Is this supposed to work, or does it only work with Logstash?
Note: I have tried to use a Painless script in the pipeline, but Painless/pipelines and backslashes just do not work well together.
e.g. this fails, as do alternatives (for example using "\"):
"source": "ctx.msg = ctx.msg.replace('\\', 'anything')"
There is no point in supplying the exact full logs and yml, as I have reproduced this in the console using _simulate with the very basic details below.
POST /_ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "pipeline to test newline",
    "processors": [
      {
        "grok": {
          "field": "message",
          "ignore_missing": true,
          "patterns": [
            "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:msg}"
          ]
        }
      },
      {
        "script": {
          "lang": "painless",
          "source": "ctx.msg = ctx.msg.replace('\\', 'anything')"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": { "message": """2020-04-23 11:33:00.0000 TEST message\nnextline""" }
    }
  ]
}
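For reference, my understanding is that the script variant would need the backslash doubled once for JSON and once more for Painless, i.e. something like the following (again untested, so please correct me if I still have the escaping wrong). The JSON string "\\\\n" should reach Painless as the two-character literal backslash-n, and "\\n" should reach it as an actual newline:

```json
"script": {
  "lang": "painless",
  "source": "ctx.msg = ctx.msg.replace('\\\\n', '\\n')"
}
```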