Question on filebeat multiline pattern

I am trying to get the logs from a legacy system into Elastic via Filebeat. Needless to say, it is the so-called "log from hell". Since it is a CSV, I am using grok, which has been a great relief.

There is one thing bothering me: the exceptions that also get dumped. Here is one example:

"2018/08/28 15:16:35.516","DEBUG","16","318219","Read","Data Fetching","121020","0","KKMR","","","(null)","",
"2018/08/28 15:16:56.464","ERROR","16","339166","Error","Data Fetching","141967","(null)","Failed to write value to JJMR","(null)","(null)","EER102","SMT.Errors.ModuleException: EER102-Failed to write value to UP for SetJTCCommand: R
   at SMT.Device\KalsJen.cs:line 1614
   at SMT.File.cs:line 30
   at SMT.SAMTStep.cs:line 39
   at SMT.Step.Simulator.Execute() in C:\JJSEN\SMT\Tips.cs:line 41
",

This is the multiline part of my Filebeat input section:

   exclude_lines: ['^DATE']
   multiline.pattern: '^\"'
   multiline.negate: true
   multiline.match: after  

In grok, I am pushing the last field into the exception field as below:

   %{QUOTEDSTRING:exception}

However, notice the last line: that naughty little double quote followed by a comma. The exception was dumped with a trailing newline, and I get an error since the closing double quote is not appended to the exception text.

Is it possible to have both lines not starting with " and lines containing only ", appended to the previous event?

That way I will get an exception text that is properly closed in double quotes. Right now I get an exception string starting with " but not ending with one.

I am trying to create a multiline pattern for that, but regex is not my strong point.
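One idea I have been toying with (untested, and it assumes every genuine record starts with a quoted timestamp in exactly this `"YYYY/MM/DD ` format) is to anchor the pattern on the timestamp rather than on the opening quote, so both the stack-trace lines and the trailing ", line get appended:

   exclude_lines: ['^DATE']
   # Assumption: a new record always begins with a quoted date, e.g. "2018/08/28
   multiline.pattern: '^"\d{4}/\d{2}/\d{2} '
   multiline.negate: true
   multiline.match: after

With negate: true and match: after, any line that does not match the timestamp pattern (including the lone ", line) should be joined to the preceding event, but I am not sure I have the regex right.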

@pk.241011 CSV is a bit of a pain to work with, with so many exceptions in format and encoding. I wonder in that case if the ingest-csv plugin would be a good fit. It's not part of the official ingest node processors, but it's developed by a colleague.

Thanks for this one. Is it going to be part of a later release? I am just wondering if it can keep up with the scale and whether it has been tested well. This is a production environment I am dealing with here.

I am not sure about the scaling of the above ingest plugin, but are you running Logstash as part of your stack? If so, Logstash has a csv filter which is part of the official release and will scale.
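As a rough sketch, a minimal csv filter block might look like the following. The column names here are made up to illustrate the idea; you would replace them with whatever your real columns are, and the quoted fields in your sample should be handled by the filter's default double-quote handling:

   filter {
     csv {
       # Hypothetical column names for illustration only
       columns => ["timestamp", "level", "thread", "sequence", "action", "category", "code", "status", "message", "exception"]
     }
   }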

Thanks. Will try with Logstash.

I don't know if the CSV processor will be part of the official release or not, you might want to create an issue on the CSV repository.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.