Logs not showing in Kibana after filter in Logstash

Hi all,

I'm using a filter to parse the Docker logs of a Python app, but after adding grok to match fields, the logs no longer show up in the Kibana dashboard.

if [type] == "log" {
  grok {
    match => {
      "log" => "%{TIMESTAMP_ISO8601:timestamp},%{NOTSPACE} %{LOGLEVEL:Level} | %{DATA:tx_id}: %{DATA:txid} | %{DATA:client_id}: %{DATA:clientid} | [%{DATA:proc_name}] - %{GREEDYDATA:message}"
    }
    break_on_match => false
    remove_tag => [ "_grokparsefailure" ]
    add_field => {
      "logtype" => "dockerInfo"
    }
  }

  grok {
    match => {
      "log" => "%{TIMESTAMP_ISO8601:timestamp},%{NOTSPACE} %{LOGLEVEL:Level} | %{DATA:tx_id}: %{DATA:txid} | %{DATA:client_id}: %{DATA:clientid} | [%{DATA:proc_name}] - %{GREEDYDATA:message}"
    }
    break_on_match => false
    remove_tag => [ "_grokparsefailure" ]
    add_field => {
      "logtype" => "dockerErr"
    }
  }

  date {
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss,SSS" ]
    remove_field => [ "timestamp" ]
    target => "@timestamp"
  }
}
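
To figure out where the events go, I'm thinking of running this filter in a throwaway pipeline and watching the console instead of Kibana. This is only a sketch (the file name test.conf and the mutate rename are mine, since stdin puts the pasted line in "message" rather than "log"):

# test.conf - throwaway pipeline, only for checking what the filter emits
input {
  stdin { type => "log" }                 # "type" here satisfies the if [type] == "log" condition
}

filter {
  mutate {
    rename => { "message" => "log" }      # stdin puts the pasted line in "message"; the filter expects "log"
  }

  # ... the whole if [type] == "log" { grok / grok / date } block from above goes here ...
}

output {
  stdout { codec => rubydebug }           # prints every field plus any _grokparsefailure / _dateparsefailure tags
}

Running it with bin/logstash -f test.conf and pasting one of the sample lines below should show whether the grok and date filters produce the fields I expect or drop/tag the event instead.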

Sample data:

2018-08-24 22:24:24,002,2 ERROR | tx_id: ofgjbqbnz7 | client_id: None | [authorization_api.py:123] - Unable to connect to OAuth API: 401 Unauthorized: The server could not verify that you are authorized to access the URL requested. You either supplied the wrong credentials (e.g. a bad password), or your browser doesn't understand how to supply the credentials required.
2018-08-24 22:24:30,954,954 INFO | tx_id: 0hfzak0eff | client_id: None | [common_functions.py:136] - Retrieving oauth2_client_data
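
One thing I'm not sure about: | and [ are regex metacharacters inside a grok pattern, so they may need escaping to match the literal pipes and brackets in these lines. An escaped version of the match (untested sketch, same field names as above) would look like:

match => {
  "log" => "%{TIMESTAMP_ISO8601:timestamp},%{NOTSPACE} %{LOGLEVEL:Level} \| %{DATA:tx_id}: %{DATA:txid} \| %{DATA:client_id}: %{DATA:clientid} \| \[%{DATA:proc_name}\] - %{GREEDYDATA:message}"
}

I haven't tried this yet, so I don't know whether it explains the missing events.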
