Conditionals in Grok filter for more than one log type

I am using a grok filter to parse my log file with Logstash. I am not getting any error, but I'm still not getting the expected results in Kibana. My config file looks like below:
filter {
  if [type] == "sdp" {

    if "Outbound Message" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:logdate} | %{LOGLEVEL:level} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{DATA:env-type} ---> %{DATA:outbound-msg}" }
      }
    }

    else if "Inbound Message" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:logdate} | %{LOGLEVEL:level} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{DATA:inbound-msg} ---> %{DATA:env-type}(%{GREEDYDATA:success-msg} : %{NUMBER:response-time}" }
      }
    }

    else {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:logdate} | %{LOGLEVEL:level} | %{GREEDYDATA:logdata}" }
      }
    }

  }
}


Sample log data:
2017-09-25 11:00:33,036 | INFO | 76b1b5ff-6d91-401a-9139-623f7343ab60 | FuseSingTelPRD02-1 | sdp-tc-manage-usage-provider-enabler | qtp1190845090-59767 | oggingOutInterceptor | {http://example/v1}ManageUsage | "searchAllowanceForCustomer" | IMP | IMP | | 2017-09-25T11:00:32.974+08:00 | 63090d35-7fe3-465c-b929-d041b7892bf7 | admin | admin2 | eng | SG | PE ---> Outbound Message
2017-09-25 11:00:33,204 | INFO | 76b1b5ff-6d91-401a-9139-623f7343ab60 | FuseSingTelPRD02-1 | sdp-tc-manage-usage-provider-enabler | default-workqueue-3 | LoggingInInterceptor | {http://example/v1/v1}ManageUsage | "searchAllowanceForCustomer" | IMP | IMP | | 2017-09-25T11:00:32.974+08:00 | 63090d35-7fe3-465c-b929-d041b7892bf7 | admin | admin2 | eng | SG | Inbound Message ---> PE(HTTP 200 - Success) and took : 168
2017-09-25 11:00:34,469 | INFO | 2df08825-ab01-4449-8590-d008c9453b48 | FuseSingTelPRD02-1 | sdp-tc-manage-usage-provider-enabler | qtp1190845090-59138 | oggingOutInterceptor | {http://example/v1/v1}ManageUsage | "searchAllowanceForCustomer" | IMP | IMP | | 2017-09-25T11:00:34.446+08:00 | d46b096e-af78-4dc6-b0e3-582409cd1282 | admin | admin2 | eng | SG | PE ---> Outbound Message
2017-09-25 11:00:34,549 | INFO | 2df08825-ab01-4449-8590-d008c9453b48 | FuseSingTelPRD02-1 | sdp-tc-manage-usage-provider-enabler | default-workqueue-4 | LoggingInInterceptor | {http://example/v1/v1}ManageUsage | "searchAllowanceForCustomer" | IMP | IMP | | 2017-09-25T11:00:34.446+08:00 | d46b096e-af78-4dc6-b0e3-582409cd1282 | admin | admin2 | eng | SG | Inbound Message ---> PE(HTTP 200 - Success) and took : 80

Kindly help me out if something is incorrect with my filter.

Use a stdout { codec => rubydebug } output to debug your filters (and show an example here), then focus your attention on getting it into ES.

My output looks like below:

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Where do I need to add the stdout { codec => rubydebug } exactly?

Anywhere in the output block. Comment out the elasticsearch output for now.

output {
  stdout { codec => rubydebug }
}
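As a sketch (assuming a package install of Logstash 5.x, with the binary under /usr/share/logstash and the pipeline config under /etc/logstash/conf.d — adjust the paths to your setup), you can also syntax-check the configuration before restarting anything:

/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/your-pipeline.conf

--config.test_and_exit parses the config, reports any errors, and exits without starting the pipeline.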

Will it be OK? And after this, do I need to restart Elasticsearch, Logstash, and Filebeat?

I am getting a syntax error for the line:
stdout { codec => rubydebug }

I am getting the error below:
[2017-09-25T11:14:17,161][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<RegexpError: end pattern with unmatched parenthesis: /(?<TIMESTAMP_ISO8601:logdate>(?:(?>\d\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))T :?(?:(?:[0-5][0-9]))(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?(?:(?:Z|+-(?::?(?:(?:[0-5][0-9])))))?) | (?LOGLEVEL:level([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)) | (?<DATA:transaction-id>.?) | (?<DATA:server-name>.?) | (?<DATA:bundle-name>.?) | (?<DATA:workqueue>.?) | (?<DATA:handler>.?) | (?<DATA:service-name>.?) | (?<DATA:api-name>.?) | (?<DATA:application-id>.?) | (?<DATA:system-id>.?) | (?<DATA:username>.?) | (?<TIMESTAMP_ISO8601:consumer-ref-timestamp>(?:(?>\d\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))T :?(?:(?:[0-5][0-9]))(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?(?:(?:Z|+-(?::?(?:(?:[0-5][0-9])))))?) | (?<DATA:consumer-ref-id>.?) | (?<DATA:csr-id>.?) | (?<DATA:user-id>.?) | (?<DATA:language-code>.?) | (?<DATA:country-code>.?) | (?<DATA:inbound-msg>.?) ---> (?<DATA:env-type>.?)((?GREEDYDATA:success-msg.) : (?NUMBER:response-time(?:(?:(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:.[0-9]+)?)|(?:.[0-9]+))))))/m>, :backtrace=>["org/jruby/RubyRegexp.java:1434:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:127:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:272:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:267:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:262:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:230:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:230:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:183:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:292:in `start_pipeline'"]}
[2017-09-25T11:14:17,260][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-09-25T11:14:20,186][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}

That's an error with your regular expression.
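Concretely, the "end pattern with unmatched parenthesis" almost certainly comes from the literal ( in %{DATA:env-type}(%{GREEDYDATA:success-msg} ..., which is never closed. A sketch of the tail of the inbound pattern with the literal characters escaped (field names kept from your pattern):

%{DATA:inbound-msg} ---> %{DATA:env-type}\(%{GREEDYDATA:success-msg}\) and took : %{NUMBER:response-time}

The literal | separators should likewise be escaped as \|, since an unescaped | is the regex alternation operator.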

Will it be OK? And after this, do I need to restart Elasticsearch, Logstash, and Filebeat?

Yes, that's fine. Since you're modifying your Logstash configuration you only need to restart Logstash.
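On a systemd-based package install (an assumption about your setup), that is typically just:

sudo systemctl restart logstash

but the exact command depends on how Logstash was installed.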

For a log entry like "Inbound Message ---> PE(HTTP 200 - Success) and took : 44", do I need to use the "SPACE" syntax in the grok filter? If yes, how do I use it?

I don't understand that question.

My question was whether I need to use the "SPACE" syntax for whitespace in my patterns. However, I found the solution: the spaces need to be substituted with a period (.), so the one below is correct:

match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} | %{DATA:api-status} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{DATA:in}.%{DATA:msg}.--->.%{DATA:id}(%{DATA:http}.%{NUMBER:code}.-.%{DATA:state}).%{DATA:and}.%{DATA:took}.:.%{NUMBER:response}" }

for the log line:
2017-09-25 11:00:33,204 | INFO | 76b1b5ff-6d91-401a-9139-623f7343ab60 | FusePRD02-1 | sdp-tc-manage-usage-provider-enabler | default-workqueue-3 | LoggingInInterceptor | {http://example/v1/v1}ManageUsage | "searchAllowanceForCustomer" | IMP | IMP | | 2017-09-25T11:00:32.974+08:00 | 63090d35-7fe3-465c-b929-d041b7892bf7 | admin | admin2 | eng | SG | Inbound Message ---> PE(HTTP 200 - Success) and took : 168
My if condition also worked after using the above grok filter. Thank you!
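An equivalent approach, as a sketch over just the tail of that pattern (keeping your field names where they still apply): since "and took :" is fixed text, it can be matched literally, and the metacharacters can be escaped with a backslash instead of being swallowed by periods:

%{DATA:in} %{DATA:msg} ---> %{DATA:id}\(%{DATA:http} %{NUMBER:code} - %{DATA:state}\) and took : %{NUMBER:response}

Both variants match the sample line; escaping just makes it explicit which characters are literal.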

Now I am using two types of logs, and the filters are different for each of them. However, I can see only one type of logs in Kibana.
My config file is as follows:
filter {

  if [type] == "provider" {

    if "Outbound" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \| %{DATA:api-status} \| %{DATA:transaction-id} \| %{DATA:server-name} \| %{DATA:bundle-name} \| %{DATA:workqueue} \| %{DATA:handler} \| %{DATA:service-name} \| %{DATA:api-name} \| %{DATA:application-id} \| %{DATA:system-id} \| %{DATA:username} \| %{TIMESTAMP_ISO8601:consumer-ref-timestamp} \| %{DATA:consumer-ref-id} \| %{DATA:csr-id} \| %{DATA:user-id} \| %{DATA:language-code} \| %{DATA:country-code} \| %{GREEDYDATA:outbound-msg}" }
      }
    }

    else if "Inbound" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} | %{DATA:api-status} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{DATA:in}.%{DATA:msg}.--->.%{DATA:id}(%{DATA:http}.%{POSINT:code}.-.%{DATA:state}).%{DATA:and}.%{DATA:took}.:.%{POSINT:response}" }
      }
    }

    else {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} | %{DATA:api-status} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{GREEDYDATA:logdata}" }
      }
    }

  }

  if [type] == "service" {

    if "Outbound" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \| %{DATA:api-status} \| %{DATA:transaction-id} \| %{DATA:server-name} \| %{DATA:bundle-name} \| %{DATA:workqueue} \| %{DATA:handler} \| %{DATA:service-name} \| %{DATA:api-name} \| %{DATA:application-id} \| %{DATA:system-id} \| %{DATA:username} \| %{TIMESTAMP_ISO8601:consumer-ref-timestamp} \| %{DATA:consumer-ref-id} \| %{DATA:csr-id} \| %{DATA:user-id} \| %{DATA:language-code} \| %{DATA:country-code} \| %{DATA:code}.\-\-\-\>.%{DATA:out}.%{DATA:msg}.\:.%{POSINT:response}" }
      }
    }

    else if "Inbound" in [message] {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} | %{DATA:api-status} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{GREEDYDATA:inbound-msg}" }
      }
    }

    else {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} | %{DATA:api-status} | %{DATA:transaction-id} | %{DATA:server-name} | %{DATA:bundle-name} | %{DATA:workqueue} | %{DATA:handler} | %{DATA:service-name} | %{DATA:api-name} | %{DATA:application-id} | %{DATA:system-id} | %{DATA:username} | %{TIMESTAMP_ISO8601:consumer-ref-timestamp} | %{DATA:consumer-ref-id} | %{DATA:csr-id} | %{DATA:user-id} | %{DATA:language-code} | %{DATA:country-code} | %{GREEDYDATA:logdata}" }
      }
    }

  }

}
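One way to narrow this down (a sketch; by default grok tags events it fails to parse with _grokparsefailure): add a temporary conditional at the end of the filter block to mark events that fell through every pattern, e.g.

if "_grokparsefailure" in [tags] {
  # hypothetical marker field, just to make unparsed events easy to find in Kibana
  mutate { add_field => { "grok_status" => "failed" } }
}

If the "provider" events show up carrying that tag, the grok patterns (not Filebeat) are the problem; if they do not show up at all, the problem is upstream of the filter.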

I can see only the "service" logs in the UI; the "provider" logs are not there.

My filebeat.yml looks like below:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/logi/provider.log
  document_type: provider

  paths:
    - /var/log/logi/service.log
  document_type: service

Is anything wrong with the filebeat.yml or the Logstash config?
PS: I have already validated the patterns in the grok debugger.
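For reference, a sketch of the more common Filebeat 5.x layout, with one prospector per log type (same paths as above):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/logi/provider.log
  document_type: provider
- input_type: log
  paths:
    - /var/log/logi/service.log
  document_type: service

With a single prospector, a repeated paths key would override the first one, which could explain why only one log type reaches Kibana.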
