Parsing multiline logs

I am attempting to index a log file that contains multiline entries, and it appears one set of multilines is parsed correctly but the other isn't. It's being parsed as a separate log entry.
Sample log file:

2017-12-12 02:44:35,456 [11-1234567890] [] []  DEBUG RequestLogger Response: SwitchingProtocols 
2017-12-12 02:44:35,456 [(null)] [1234567] [(null)] (null) DEBUG AppName a122d567-1234-5678-9012-aabbcceedd - Command: BeginTransaction - Response: AQAAAasdfasdfasdfasdfasdfasdfasdfasdfsadfasdfEAAABUUzAxLVFBSFlTMDEzAGF0HAAAAFQAUwAwADEALQBRAEEASABZAFMAMAAxADMAAAABAAAAAAAVFBSFlTMDEzLmhlZGdlc2VydnRlc3QuY29tLwAn
2017-12-12 02:44:35,519 [2-12345] [6666666] [] Jump_To_TE DEBUG RequestLogger Request: PUT https://aa02-abcde111.somename.com/api/dummyHis
{"SomeNameName": "Default Primary Some Name", "asdfasdfasdf": "ASDF", "AasdfasdfasdfName": "ASD/ASD 04/21/19", "LISTFactor": 2, "Date": "22200e-00-07T00:00:00", "DICE": 1.345345345}
2017-12-12 02:44:35,581 [3-12313131313131313131] [1231232131] [100] NANME_BLUE_DZ DEBUG RequestLogger Response: OK 
2017-12-12 02:44:35,597 [88-34534534534534534] [1313131331] [] Single_HARD_D_2 INFO  APPNAME HUH with dummy id : 2344 inserted with status : Success ( Message: ) for WWClientId: 20
2017-12-12 02:44:37,586 [0-asdfasdfasdfasdfasdfsa] [] []  DEBUG RequestLogger Response: SwitchingProtocols 
2017-12-12 02:44:37,586 [(null)] [345345345] [(null)] (null) DEBUG Jumpk 124123123-3434-3434-3434-3434343434 - Command: BeginDUMMY - Response: ASDFASDFASDFSADFSADFSADFSADFSADFSADFSADFASDFSAFSADFASDFSADFSADFSADFASFASDFASDFSADFSADFSADFASDFASDFSADFASDFSADFASDFSADFSADFSADFSDFAGSADFASDFSADFASDFSADFASDFASDFASDFASDFASDFASDFTEyM2Y2ZTZiNQAtNDgNAAAAZM1kzSEAAABUUzAxLVFBSFlTMDEzAGF0HAAAAFQAUwAwADEALQBRAEEASABZAFMAMAAxADMAAAABAAAAAAAAACYAAAB0aXA6Ly9UUzAxLVFBSFlTMDEzLmhlZGdlc2VydnRlc3QuY29t
2017-12-12 02:44:37,649 [55-123412341234123412341] [-12312312312] [] JUMP_MAN_AA_BCD_YOOOO_AWESOME_CLEAR_YUP_ASDFASDFA3442 DEBUG RequestLogger Request: POST https://aaa-asfasdf.asfdasdf.com/api/YUP/Batch
[{"RATE": 1.123123123, "JUMPMANType": "HAHAHA", "DUMMYNumber": 1234, "actionNumber": null, "Date": "20023-04-30T00:00:00", "Security": "ABR/LOG/LOG 2003123-04-70 201238-024-aed29", "Name": null, "Date": null, "Name": "located", "Price": null, "unt": null, "Code": false, "ithotor": -000.000000, "Locarate": 1.0klj000000, "Quaity": -10-09-0, "Date": "20-0908-04-30T00:00:00", "TraTreatment": "", "SeeNumber": 15009-0901002, "Date": "20-090908-04-24T00:00:00", "rate": null, "0989ce": 0.8908098800000, "Date": "20-089008-04-24T00:00:00", "ocessed": true, "ercentage": null, "onCode": "or", "yCode": "SD", "Amount": null, "Active": true, "fier": 8908, "rest": false, "Name": "r 2", "TrFes": [], "PorName": "rt", "Amount": null, "Type": ""}]
2017-12-12 02:44:37,774 [33-2234234234234324] [-3123123123] [] JUMP_MAN_TT_TDE_CRIP_in_Blood_AAB_AB12313 INFO  ANAME raid with raid id : 1233 inserted with status : Success ( Message: ) for HsClientId: 20
2017-12-12 02:44:37,774 [123-123123123123123123] [-123123123] [22] HI_ME_UD_TDE_GANG_in_BGANGase_AAC_S123123862 DEBUG RequestLogger Response: OK 

filebeat.yml:

filebeat:
  prospectors:
    -
      paths:
        - /home/name/name/*.log
      input_type: log
      ignore_older: 24h
      multiline.pattern: '^[[:punct:]]'
      multiline.negate: false
      multiline.match: after
      fields:
        client: "name"
        tags: "Q"
        type: "api"
      fields_under_root: true
  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    hosts: ["192.198.222.118:5044"]

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB
  level: debug

grok pattern:

filter {
  if [type] == "api" {
    grok {
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID}] \[%{DATA:TID}] \[%{DATA:HClient}]\s(%{DATA:TName}) %{WORD:LogLevel} %{GREEDYDATA:Message}\s\[%{DATA:questBody}]"}
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID}] \[%{DATA:TID}] \[%{DATA:HClient}]\s(%{DATA:TName}) %{WORD:LogLevel} %{GREEDYDATA:Message}"}
      match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:CID}] \[%{DATA:TID}] \[%{DATA:HClient}]\s %{WORD:LogLevel} %{GREEDYDATA:Message}"}
    }
  }
}

The multiline that begins with "{"SomeNameName":" gets parsed correctly, but the second multiline, which begins with "[{"RATE":", gets parsed as a separate log entry and never ends up in the questBody field from the grok pattern. I've tried several different multiline.pattern options as well as changing negate to true.

You can set your pattern differently to match the date at the beginning of your log lines:

multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
multiline.negate: true
multiline.match: after
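
With negate set to true, any line that does not start with a timestamp is appended to the previous line that does, so both the "{" and the "[{" JSON continuation lines stay in the same event as their request line. As a rough sketch only (reusing the paths and fields from your prospector config above, so adjust to your setup), the prospector section would look like:

filebeat:
  prospectors:
    -
      paths:
        - /home/name/name/*.log
      input_type: log
      ignore_older: 24h
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
      multiline.negate: true
      multiline.match: after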

Thanks for your reply @kvch

I ended up figuring out my issue. I made changes to the grok pattern and also added a kv filter. Just in case anyone else has a similar issue, the new filter is below:

filter {
  if [type] == "api" {
    grok {
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID:int}] \[%{DATA:TID:int}] \[%{DATA:HClient}]\s(%{DATA:TName}) %{WORD:LogLevel} %{DATA:Msg}\s\[(?<questBody>(.*))]"}
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID:int}] \[%{DATA:TID:int}] \[%{DATA:HClient}]\s(%{DATA:TName}) %{WORD:LogLevel} %{GREEDYDATA:Msg}"}
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID:int}] \[%{DATA:TID:int}] \[%{DATA:HClient}]\s %{WORD:LogLevel} %{GREEDYDATA:Msg}"}
    }
    if [questBody] {
      kv {
        source => "questBody"
        value_split => ":"
        include_brackets => true
        trim_value => ","
        trim_key => "\",\[\]<>\{\}"
        remove_char_key => "\{\}\[\]\""
      }
    }
  }
}
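
If anyone wants to try the filter on one of the consolidated events before wiring it into the full pipeline, a quick way is to wrap it in a stdin/stdout pipeline and paste an event. This is only a sketch: it assumes a local Logstash install, the file name test.conf is just an example, and only the first grok pattern is shown.

# test.conf - minimal sketch for trying the filter by itself
# (assumes Logstash is installed locally; run with: bin/logstash -f test.conf)
input {
  # paste one consolidated event per line on stdin
  stdin { type => "api" }
}
filter {
  if [type] == "api" {
    grok {
      # first pattern from the filter above; add the other two match lines the same way
      match => {"message" => "%{TIMESTAMP_ISO8601:Timestamp} \[%{DATA:CID:int}] \[%{DATA:TID:int}] \[%{DATA:HClient}]\s(%{DATA:TName}) %{WORD:LogLevel} %{DATA:Msg}\s\[(?<questBody>(.*))]"}
    }
    if [questBody] {
      kv {
        source => "questBody"
        value_split => ":"
        include_brackets => true
        trim_value => ","
        trim_key => "\",\[\]<>\{\}"
        remove_char_key => "\{\}\[\]\""
      }
    }
  }
}
output {
  # print every extracted field
  stdout { codec => rubydebug }
}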

Thanks for sharing it with us!
