Multiline txt log files not parsing correctly in ingest pipeline

[on Windows] I have a log file that can contain entries spanning multiple lines.

The filebeat.yml is correctly configured for multiline.
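For reference, a minimal multiline configuration of the kind I mean (the path and timestamp pattern here are placeholders, not my exact config):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - 'C:\logs\app\*.log'   # placeholder path
    # Treat any line that does NOT start with a timestamp as a
    # continuation of the previous line.
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    multiline.negate: true
    multiline.match: after
```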

The source appearing in Elastic contains "\n":
a source file entry spanning two lines will appear in the source, and in Kibana, as a single line with the line break encoded as "\n".
The ingest pipeline parses the source with a grok processor, capturing everything after the timestamp as %{GREEDYDATA:msg}.

The msg field then appears within Kibana truncated at the first "\n", not as the full message with newlines.

The exact same yml was used previously with a very early version of Filebeat (1.1) and ELK (i.e. Logstash) and worked fine.

It would appear that Filebeat is encoding the newline in the log file as "\n" for JSON,
but the ingest pipeline is then not decoding it when parsing on the Elasticsearch side.
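To illustrate what I mean by encoding/decoding, here is a minimal sketch using Python's json module; it only shows the JSON round trip, nothing Filebeat-specific:

```python
import json

# What Filebeat reads from disk: a message containing a real newline character.
original = "TEST message\nnextline"

# On the wire the newline is encoded as the two characters backslash + n.
wire = json.dumps(original)
print(wire)  # "TEST message\nnextline"

# Decoding the JSON restores the real newline, so a consumer that decodes
# properly sees an actual newline character, not a literal backslash-n.
decoded = json.loads(wire)
print(decoded == original)  # True
print("\\n" in decoded)     # False: no literal backslash-n sequence
print("\n" in decoded)      # True: a real newline character
```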

Is this supposed to work, or does it only work with Logstash?

Note: I have tried to use a Painless script in the pipeline, but Painless/pipelines and backslashes just do not work well together.
e.g. this fails, as do any alternatives (for example using "\"):

     "source": "ctx.msg = ctx.msg.replace('\\', 'anything')"
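My suspicion is that a double layer of escaping is part of the problem: the script source passes through JSON string decoding before Painless compiles it, so a single literal backslash has to be written as four backslashes in the pipeline body. A sketch of what I mean (assuming String.replace with literal string arguments; not verified against a live cluster):

```json
{
  "script": {
    "lang": "painless",
    "source": "ctx.msg = ctx.msg.replace('\\\\', 'anything')"
  }
}
```

JSON decoding turns `\\\\` into `\\`, and Painless then interprets `'\\'` as a string containing one backslash character.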

There is no point in supplying the exact full logs and yml, as I have reproduced this using the console _simulate API with the very basic details below:

POST /_ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "pipeline to test newline",
    "processors": [
      {
        "grok": {
          "field": "message",
          "ignore_missing": true,
          "patterns": [
            "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:msg}"
          ]
        }
      },
      {
        "script": {
          "lang": "painless",
          "inline": "ctx.msg = ctx.msg.replace('\\', 'anything')"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": { "message": """2020-04-23 11:33:00.0000 TEST message\nnextline""" }
    }
  ]
}