Logstash debugger tool shows a match for the filter but the new fields don't appear in Kibana

I am having an issue with my Logstash grok filters. When I paste a log entry and my filter into the grok debugger tool to test the parser, it checks out OK and gives me the expected output with the extra fields I am looking for. However, when I add this new filter to Logstash, it starts OK without complaining about the configuration, but I don't see my new fields in Kibana. Could you please help me find out what's wrong? Here are my config files:

1) 10-filters.conf file under /etc/logstash/conf.d/

    filter {
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        syslog_pri { }
        date {
          match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }

      if [type] == "uwsgi_api" {
        grok {
          match => { "message" => "^%{SYSLOGTIMESTAMP:date} \[pid: %{INT:pid}\]: %{LOGLEVEL:loglevel}:%{GREEDYDATA:filepath}:Call( by user %{QUOTEDSTRING:username})? (raised %{WORD:abort}|completed without aborting)" }
          add_field => { "type" => "user_info" }
        }
        grok {
          patterns_dir => ["/etc/logstash/patterns"]
          match => { "message" => "\[.*\] %{IP:ipaddress} \(\) {.*} \[%{DATESTAMP_FLASK:timestamp}\] %{WORD:method} %{URIPATHPARAM:fullpath} => generated %{INT:bytes} bytes in %{INT:ms} msecs \(HTTP/1.1 %{INT:return_code}" }
          add_field => { "type" => "flask_info" }
        }
      }
    }
2) On the client side I can see that the right document_type (type) is assigned to each entry being sent to the ELK server:

    /usr/share/filebeat/bin/filebeat -c /etc/filebeat/filebeat.yml -e -d "*"

    2017/12/05 20:12:18.047741 client.go:214: DBG  Publish: {
      "@timestamp": "2017-12-05T20:12:10.417Z",
      "beat": {
        "hostname": "prod-analytics",
        "name": "prod-analytics",
        "version": "5.6.2"
      },
      "input_type": "log",
      "message": "65.206.3.148 - - [05/Dec/2017:11:17:29 -0500] \"GET /api/v4/projects/demo/pr3tswz4/terms/?terms=%5B%22humor%2C2%7Cen%22%5D HTTP/1.1\" 200 148 \"-\" \"python-requests/2.18.1\"",
      "offset": 18290742,
      "source": "/var/log/nginx/analytics-api-access.log",
      "type": "syslog"
    }
    2017/12/05 20:12:18.047812 client.go:214: DBG  Publish: {
      "@timestamp": "2017-12-05T20:12:10.417Z",
      "beat": {
        "hostname": "prod-analytics",
        "name": "prod-analytics",
        "version": "5.6.2"
      },
      "input_type": "log",
      "message": "Dec 05 15:10:09.628 [pid: 128439]: INFO:lumi_api.routes.utils.framework:Call by user 'lumi-test' completed without aborting",
      "offset": 14035525,
      "source": "/var/log/uwsgi/app/api.log",
      "type": "uwsgi_api"
    }
    2017/12/05 20:12:18.047888 client.go:214: DBG  Publish: {
      "@timestamp": "2017-12-05T20:12:10.417Z",
      "beat": {
        "hostname": "prod-analytics",
        "name": "prod-analytics",
        "version": "5.6.2"
      },
      "input_type": "log",
      "message": "[pid: 128439|app: 0|req: 63/211940] 52.72.8.25 () {38 vars in 680 bytes} [Tue Dec  5 15:10:09 2017] GET /api/v4/projects/lumi-test/prs5dzbw/docs/search/?terms=%5B%22clarity%7Cen%22%5D\u0026limit=20 =\u003e generated 318169 bytes in 52 msecs (HTTP/1.1 200) 2 headers in 75 bytes (3 switches on core 0)",
      "offset": 14035816,
      "source": "/var/log/uwsgi/app/api.log",
      "type": "uwsgi_api"
    }

I am really confused about why it doesn't work. I also have a patterns file that I am using for the second grok filter:

    MONTHDAY_FLASK (?:(?:[0 ][1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
    DATESTAMP_FLASK %{DAY} %{MONTH} %{MONTHDAY_FLASK} %{HOUR}:%{MINUTE}:%{SECOND} %{YEAR}
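The MONTHDAY_FLASK pattern exists to accept uwsgi's space-padded day of month (as in "Tue Dec  5 15:10:09 2017"). For completeness, a date filter along these lines (a minimal sketch; "timestamp" is the capture name from the second grok, and it assumes the log time is in the server's local timezone) would let that field drive @timestamp once the grok matches:

    date {
      # uwsgi pads single-digit days with a space, hence the "MMM  d" variant
      match => [ "timestamp", "EEE MMM  d HH:mm:ss yyyy", "EEE MMM dd HH:mm:ss yyyy" ]
    }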

What do the events Logstash produces look like? Use a stdout { codec => rubydebug } output.
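For example, something like this alongside your existing outputs (a minimal sketch) prints every event with all of its fields to Logstash's stdout, so you can see exactly what your filters produced before the event reaches Elasticsearch:

    output {
      # dump each processed event in full for debugging
      stdout { codec => rubydebug }
    }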

Magnus, thank you for your response; I figured it out. The problem was in the 2nd and 3rd grok filters themselves: their add_field settings were overriding the type attribute that was set to uwsgi_api, which is why the filter was never applied. I just deleted the add_field ("type" ...) lines and it worked. However, I am seeing a different issue now: Logstash is trying to publish events to closed indices and becomes stale after some time since the events can't be published. I think I will open a new topic for it. Thank you again for the response.
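For reference, a sketch of the corrected second block (the match patterns are unchanged from above; I simply deleted the lines, but if you still want to distinguish the two cases, adding the marker under a new field name works too — log_subtype below is just an illustrative name, not from the original config):

    if [type] == "uwsgi_api" {
      grok {
        match => { "message" => "^%{SYSLOGTIMESTAMP:date} \[pid: %{INT:pid}\]: %{LOGLEVEL:loglevel}:%{GREEDYDATA:filepath}:Call( by user %{QUOTEDSTRING:username})? (raised %{WORD:abort}|completed without aborting)" }
        # don't touch "type" here, since the conditionals (and downstream
        # routing) key on it; use a separate field for the subtype instead
        add_field => { "log_subtype" => "user_info" }
      }
      grok {
        patterns_dir => ["/etc/logstash/patterns"]
        match => { "message" => "\[.*\] %{IP:ipaddress} \(\) {.*} \[%{DATESTAMP_FLASK:timestamp}\] %{WORD:method} %{URIPATHPARAM:fullpath} => generated %{INT:bytes} bytes in %{INT:ms} msecs \(HTTP/1.1 %{INT:return_code}" }
        add_field => { "log_subtype" => "flask_info" }
      }
    }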
