Tag ["_parsegrokfailure"] anyone could help me?

Hello, I created a grok pattern and it works well in the Grok Debugger, but when I add it to the filter inside the pipeline, the service restarts fine, yet in Kibana the events get the tag ["_grokparsefailure"]. I tried to debug it but couldn't reach any solution. Here is a log sample that I created the grok pattern for:

<14> Server MTM: 8871AC1 Alert Text: Login ID: USERID from webguis at IP address 10.0.6.87 has logged off. Type of Alert: System - Remote Login Severity: 4 Date(m/d/y): 07/14/2020 Time(h:m:s): 21:57:48 Contact: it@sumerge.com Location: Cairo IMM Text ID: Lenovo System x3650 M5 IMM Serial Number: J335FWW IMM UUID: 11797DCA56B611E79CC00894EF47086E Event ID: 4000009d00000000 Serviceable Event Indicator: Not Serviceable FRU list: Not available Room ID: Not available Rack ID: Not available Lowest U-position: 0 Blade Bay: Not available Test Alert: no Auxiliary Data: Not available Common Event ID: Not available Event Type: 0 Report Chain: Not available

and here is the grok pattern that works well in the Grok Debugger:

%{SYSLOG5424PRI}%{SPACE}Server MTM:%{DATA:server_mtm} Alert Text: %{DATA:message} Type of Alert: %{DATA:alert_type} Severity: %{DATA:severity} Date\(m\/d\/y\): %{DATE:date} Time\(h\:m\:s\): %{DATA:time} Contact: %{DATA:contact} Location: %{DATA:location} Text ID: %{DATA:text_id} Serial Number: %{DATA:serial_number} IMM UUID: %{DATA:imm_uuid} Event ID: %{DATA:event_id}%{SPACE} %{GREEDYDATA:data}

and here is the pipeline itself:

input {
  udp {
    port => 5516
    type => syslog
  }
}

filter {
  grok {
    match => {
       "message" => [ "%{SYSLOG5424PRI}%{SPACE}Server MTM:%{DATA:server_mtm} Alert Text: %{DATA:message} Type of Alert: %{DATA:alert_type} Severity: %{DATA:severity} Date\(m\/d\/y\): %{DATE:date} Time\(h\:m\:s\): %{DATA:time} Contact: %{DATA:contact} Location: %{DATA:location} Text ID: %{DATA:textid} Serial Number: %{DATA:serial_number} IMM UUID: %{DATA:imm_uuid} Event ID: %{DATA:event_id}%{SPACE} %{GREEDYDATA:data}" ]
    }
    overwrite => [ "message" ]
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
    remove_field => [ "syslog5424_pri", "data" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://10.0.200.120:9200"]
    index => "lenovo-x3650-%{+YYYY.MM.dd}"
  }
}

Here is a screenshot from Kibana showing the log itself and the tag that appears:

If anyone could help me I'd be grateful.

Thank you

When I run

input {
  generator {
    count => 1
    lines => [ '<14> Server MTM: 8871AC1 Alert Text: Login ID: USERID from webguis at IP address 10.0.6.87 has logged off. Type of Alert: System - Remote Login Severity: 4 Date(m/d/y): 07/14/2020 Time(h:m:s): 21:57:48 Contact: it@sumerge.com Location: Cairo IMM Text ID: Lenovo System x3650 M5 IMM Serial Number: J335FWW IMM UUID: 11797DCA56B611E79CC00894EF47086E Event ID: 4000009d00000000 Serviceable Event Indicator: Not Serviceable FRU list: Not available Room ID: Not available Rack ID: Not available Lowest U-position: 0 Blade Bay: Not available Test Alert: no Auxiliary Data: Not available Common Event ID: Not available Event Type: 0 Report Chain: Not available' ]
  }
}
filter {
    grok {
        match => { "message" => [ "%{SYSLOG5424PRI}%{SPACE}Server MTM:%{DATA:server_mtm} Alert Text: %{DATA:message} Type of Alert: %{DATA:alert_type} Severity: %{DATA:severity} Date\(m\/d\/y\): %{DATE:date} Time\(h\:m\:s\): %{DATA:time} Contact: %{DATA:contact} Location: %{DATA:location} Text ID: %{DATA:textid} Serial Number: %{DATA:serial_number} IMM UUID: %{DATA:imm_uuid} Event ID: %{DATA:event_id}%{SPACE} %{GREEDYDATA:data}" ] }
        overwrite => [ "message" ]
    }
}
output { stdout { codec => rubydebug { metadata => false } } }
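(If you save this as, say, test.conf, you can run it with bin/logstash -f test.conf; the rubydebug codec prints each parsed event and all of its fields to stdout.)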

It parses just fine, so something is not the way you are saying it is...

      "severity" => "4",
      "location" => "Cairo IMM",
    "server_mtm" => " 8871AC1",
        "textid" => "Lenovo System x3650 M5 IMM",
    "alert_type" => "System - Remote Login",

etc.

What do you mean?

I mean that the grok filter you show successfully parses the sample data you provided. So either you are using a different grok filter, or the _grokparsefailure is occurring for a different data format.

After adding this line to the pipeline I got to see the actual output on screen:

  stdout { codec => rubydebug }
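so that the output section of the pipeline above becomes (the elasticsearch block is unchanged):

output {
  elasticsearch {
    hosts => ["http://10.0.200.120:9200"]
    index => "lenovo-x3650-%{+YYYY.MM.dd}"
  }
  # temporary debug output: prints each event to the console
  stdout { codec => rubydebug }
}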

The log line being parsed is very different from the one that appears in Kibana, which obviously shows why the _grokparsefailure appears:

<14>\tServer MTM: 8871AC1\n\n\n\tAlert Text: Login ID: USERID from webguis at IP address 10.0.6.87 has logged off.\n\tType of Alert: System - Remote Login\n\n\tSeverity: 4\n\tDate(m/d/y): 07/15/2020\n\tTime(h:m:s): 14:19:28\n\n\tContact: it@sumerge.com\n\n\tLocation: Cairo\n\tIMM Text ID: Lenovo System x3650 M5\n\tIMM Serial Number: J335FWW\n\tIMM UUID: 11797DCA56B611E79CC00894EF47086E\n\tEvent ID: 4000009d00000000\n\tServiceable Event Indicator: Not Serviceable\n\tFRU list: Not available\n\tRoom ID: Not available\n\tRack ID: Not available\n\tLowest U-position: 0\n\tBlade Bay: Not available\n\tTest Alert: no\n\tAuxiliary Data: Not available\n\tCommon Event ID: Not available\n\tEvent Type: 0\n\tReport Chain: Not available

Could you please tell me how to handle the \n and \t characters so that the log is changed into the format I used for the grok pattern?
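A mutate filter with gsub can do that. The match part of each gsub entry is a regular expression, so "\t" and "\n" there already match the literal tab and newline characters in the event. A minimal sketch:

filter {
  mutate {
    # replace every tab with a space, then delete every newline
    gsub => [
      "message", "\t", " ",
      "message", "\n", ""
    ]
  }
}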

I fixed the problem by mutating the log itself before parsing it, and the pipeline is as follows:

input {
  udp {
    port => 5516
    type => syslog
  }
}

filter {
  mutate {
    # all three substitutions in a single gsub array, applied in order
    gsub => [
      "message", "\n\n\t", " ",
      "message", "\t", " ",
      "message", "\n", ""
    ]
  }
  grok {
    match => {
       "message" => [ "%{SYSLOG5424PRI}%{SPACE}Server MTM:%{DATA:server_mtm} Alert Text: %{DATA:message} Type of Alert: %{DATA:alert_type} Severity: %{NUMBER:severity} Date\(m\/d\/y\): %{DATE:date} Time\(h\:m\:s\): %{TIME:time} Contact: %{DATA:contact} Location: %{DATA:location} IMM Text ID: %{DATA:imm_text_id} Serial Number: %{DATA:serial_number} IMM UUID: %{DATA:imm_uuid} Event ID: %{DATA:event_id}%{SPACE} %{GREEDYDATA:data}" ]
    }
    overwrite => [ "message" ]
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
    remove_field => [ "syslog5424_pri", "data" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://10.0.200.120:9200"]
    index => "lenovo-x3650-%{+YYYY.MM.dd}"
  }
}
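For anyone who wants to verify the gsub rules without a real device, the same generator trick from earlier in the thread works. A minimal sketch (it assumes config.support_escapes: true is set in logstash.yml so that the \t and \n escapes in the double-quoted generator string become real control characters, and it uses a shortened version of the raw log for brevity):

input {
  generator {
    count => 1
    lines => [ "<14>\tServer MTM: 8871AC1\n\n\n\tAlert Text: Login ID: USERID from webguis at IP address 10.0.6.87 has logged off.\n\tType of Alert: System - Remote Login\n\n\tSeverity: 4" ]
  }
}
filter {
  # same normalization as in the fix above
  mutate {
    gsub => [
      "message", "\n\n\t", " ",
      "message", "\t", " ",
      "message", "\n", ""
    ]
  }
}
output { stdout { codec => rubydebug } }

After the mutate, the printed message should be back on a single line, in the same shape as the sample the original grok pattern was written against.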
