I am trying to parse the log below with Logstash, but I can't get the pattern to work. Essentially, the line immediately after the dated line, beginning with {0=Failed, needs to be part of the same multiline event as the dated line above it, along with all the Java output beginning with a space below that, right up until the next dated log line.
This is a sample of the log I am trying to parse through Logstash:
2016-07-28 00:10:04,762 ERROR [qtp11076495-4340] com.pciwarehouse.gamediv.passimodutti.rest.passimoduttiRestImpl [writeDiagnosticMessage:1069] - Failed to post data to store FlameBoyance. Diagnostic data at 16CF/57/99/3F/4C/AC/1E/16CF-57993F4C-AC1E9A98-DCEA-AC1EA8EB-1F92-41BD0E. Failure for entries at position [0]
{0=Failed to post data to store FlameBoyance - com.google.common.util.concurrent.UncheckedExecutionException: Portable(com.tangosol.util.WrapperException): (Wrapped: Failed request execution for FilterPartitionedPofCache service on Member(Id=4, Timestamp=2016-06-28 16:00:15.747, Address=172.30.168.235:10008, MachineId=28623, Location=site:KWSDev,machine:grdplo1001,process:2723,member:n4-grdplo1001.pciwarehouse.local, Role=pciwarehousegamedivpassimoduttiCacheServer)) null
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)
... 1 more
Caused by: Portable(java.lang.UnsupportedOperationException)
at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:57)
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3316)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
at com.tangosol.io.pof.PortableException.readExternal(PortableException.java:150)
at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:59)
... 12 more
}
2016-07-28 00:10:04,490 INFO [qtp11076495-4340] com.pciwarehouse.gamediv.passimodutti.rest.resources.RestResource [time_functor:172] - http://hollana.htu.local:8082/chutneymarey/FindingStrings
And this is the filter I have in place:
filter {
  grok {
    match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{SYSLOG5424PRINTASCII:thread}%{SPACE}%{JAVACLASS:logger}%{SPACE}\[%{JAVAMETHOD:method}:%{NUMBER:line}]%{SPACE}-%{SPACE}%{GREEDYDATA:message}" }
  }
  date {
    match => [ "timestamp", "MMM dd YYY HH:mm:ss", "MMM d YYY HH:mm:ss", "ISO8601" ]
    remove_field => [ "timestamp" ]
  }
  multiline {
    pattern => "^\{"
    what => "previous"
    negate => true
  }
}
It's looking much better than it did before, though I'm still getting a _grokparsefailure:
{
"message" => "2016-07-28 00:10:04,762 ERROR [qtp11076495-4340] com.pciwarehouse.gamediv.passimodutti.rest.passimoduttiRestImpl [writeDiagnosticMessage:1069] - Failed to post data to store FlameBoyance. Diagnostic data at 16CF/57/99/3F/4C/AC/1E/16CF-57993F4C-AC1E9A98-DCEA-AC1EA8EB-1F92-41BD0E. Failure for entries at position [0]\nFailed to post data to store FlameBoyance. Diagnostic data at 16CF/57/99/3F/4C/AC/1E/16CF-57993F4C-AC1E9A98-DCEA-AC1EA8EB-1F92-41BD0E. Failure for entries at position [0]\n{0=Failed to post data to store FlameBoyance - com.google.common.util.concurrent.UncheckedExecutionException: Portable(com.tangosol.util.WrapperException): (Wrapped: Failed request execution for FilterPartitionedPofCache service on Member(Id=4, Timestamp=2016-06-28 16:00:15.747, Address=172.30.168.235:10008, MachineId=28623, Location=site:KWSDev,machine:grdplo1001,process:2723,member:n4-grdplo1001.pciwarehouse.local, Role=pciwarehousegamedivpassimoduttiCacheServer)) null\n at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)\n at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)\n at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:51)\n ... 1 more\nCaused by: Portable(java.lang.UnsupportedOperationException)\n at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:57)\n at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3316)\n at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)\n at com.tangosol.io.pof.PortableException.readExternal(PortableException.java:150)\n at com.tangosol.io.pof.ThrowablePofSerializer.deserialize(ThrowablePofSerializer.java:59)\n ... 12 more\n}",
"@version" => "1",
"@timestamp" => "2016-07-27T23:10:04.762Z",
"host" => "grdvls1004.pciwarehouse.local",
"loglevel" => "ERROR",
"thread" => "[qtp11076495-4340]",
"logger" => "com.pciwarehouse.gamediv.passimodutti.rest.passimoduttiRestImpl",
"method" => "writeDiagnosticMessage",
"line" => "1069",
"tags" => [
[0] "_grokparsefailure",
[1] "multiline"
]
}
{
"message" => "2016-07-28 00:10:04,490 INFO [qtp11076495-4340] com.pciwarehouse.gamediv.passimodutti.rest.resources.RestResource [time_functor:172] - http://hollana.htu.local:8082/chutneymarey/FindingStrings\nhttp://hollana.htu.local:8082/chutneymarey/FindingStrings",
"@version" => "1",
"@timestamp" => "2016-07-27T23:10:04.490Z",
"host" => "grdvls1004.pciwarehouse.local",
"loglevel" => "INFO",
"thread" => "[qtp11076495-4340]",
"logger" => "com.pciwarehouse.gamediv.passimodutti.rest.resources.RestResource",
"method" => "time_functor",
"line" => "172"
}
I suspect you have an extra grok filter somewhere, because the grok filter you're showing above is obviously successful. Are there any unexpected files (e.g. backup files) in /etc/logstash/conf.d or wherever you store your configuration files?
Putting the multiline filter at the beginning did the trick! Thank you!
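For future readers, a sketch of the reordering described above: the multiline filter moved to the top of the filter block so the whole event is assembled before grok runs, everything else unchanged from the original config.

```conf
filter {
  multiline {
    pattern => "^\{"
    what => "previous"
    negate => true
  }
  grok {
    match => { "message" => "\A%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{SYSLOG5424PRINTASCII:thread}%{SPACE}%{JAVACLASS:logger}%{SPACE}\[%{JAVAMETHOD:method}:%{NUMBER:line}]%{SPACE}-%{SPACE}%{GREEDYDATA:message}" }
  }
  date {
    match => [ "timestamp", "MMM dd YYY HH:mm:ss", "MMM d YYY HH:mm:ss", "ISO8601" ]
    remove_field => [ "timestamp" ]
  }
}
```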
Thanks - I'll now switch to the multiline codec. Regarding the multiline codec: does it have to be used under the input section of a configuration file, or can it also be used under the filter section?
I receive various application log files of different formats from multiple servers. What is the best way to integrate the multiline codec into the input?
Would something like this be suitable? All the log files generally start with an ISO timestamp.
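As a side note, codecs can only be attached to input (or output) plugins; they cannot appear in the filter section. A minimal sketch of a multiline codec keyed on a leading ISO timestamp (the path is a placeholder, not from this thread):

```conf
input {
  file {
    path => "/var/log/myapp/*.log"    # placeholder path
    codec => multiline {
      # Lines NOT starting with an ISO-8601 timestamp are appended
      # to the previous event (i.e. stack-trace continuation lines).
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}
```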
It is often beneficial to perform the multiline processing as close to the source as possible, and FileBeat now supports multiline processing. Have you considered moving this logic from Logstash to FileBeat?
Would something like this work? Essentially, telling it which log files need multiline processing, rather than forcing multiline on all the files being sent to Logstash.
AFAIK Filebeat's multiline expressions don't support Logstash's grok patterns, but otherwise it looks okay. I believe the documentation contains details about the regexp flavor that's supported.
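To illustrate, a hedged sketch of the Filebeat side using a plain regex in place of %{TIMESTAMP_ISO8601} (the path is a placeholder, and the exact option names depend on the Filebeat version):

```yaml
filebeat.prospectors:            # renamed to filebeat.inputs in later releases
  - input_type: log
    paths:
      - /var/log/myapp/*.log     # placeholder path
    # Plain regex matching a leading ISO-8601 date; Filebeat does not
    # understand Logstash grok patterns such as %{TIMESTAMP_ISO8601}.
    multiline.pattern: '^\d{4}-\d{2}-\d{2} '
    multiline.negate: true
    multiline.match: after       # append non-matching lines to the previous line
```

Each stack trace then arrives at Logstash already assembled into a single event, and only these prospectors pay the multiline cost.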