Filter does not seem to supply logs for Kibana

I have an IIS filter that seems to test fine in the Grok Debugger, but I am not seeing any messages indexed in Kibana via Filebeat. Here is my beats.conf:

    input {
      beats {
        port => "5043"
      }
    }

    filter {
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
        }
        date {
          match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }

      if [type] == "iis_log" {
        # IIS header/comment lines start with '#'; drop them before parsing
        if [message] =~ "^#" {
          drop {}
        }
        grok {
          match => { "message" => [ "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:iisSite} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer}" ] }
        }
      }
    }

    output {
      elasticsearch {
        hosts => [""]
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
      }
    }
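Before sending events through, the pipeline file can be syntax-checked with Logstash's built-in config test (assuming beats.conf is the pipeline file and the command is run from the Logstash home directory):

```
bin/logstash -f beats.conf --config.test_and_exit
```

If the closing braces above are missing in the real file, this check will report the parse error immediately.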

Here is an excerpt from the IIS log:

#Fields: date time s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) cs-host sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2018-02-28 08:33:38 host123 GET /some/internal/directory - 443 product/version+build;+iOS+version)+product/1.0 - 123 0 0 233 123 4321

The Grok Debugger tells me that some of my fields are matching what the IIS logs are supplying.
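The Grok Debugger only exercises the pattern, not the rest of the pipeline. As a quick sanity check outside Logstash, a simplified stand-in for the first two grok fields can be run against the sample line (the regexes below are rough approximations of %{TIMESTAMP_ISO8601} and %{WORD}, not the exact expressions grok compiles):

```python
import re

# Rough stand-ins for the grok patterns used in beats.conf; these are
# simplified approximations, NOT the regexes grok actually compiles.
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}"
WORD = r"\S+"  # grok's WORD is stricter (\b\w+\b)

# Sample line from the IIS log excerpt below.
line = ("2018-02-28 08:33:38 host123 GET /some/internal/directory "
        "- 443 product/version+build;+iOS+version)+product/1.0 - 123 0 0 233 123 4321")

# Match the leading timestamp and site name, as the iis_log grok does.
m = re.match(rf"(?P<log_timestamp>{TIMESTAMP_ISO8601}) (?P<iisSite>{WORD})", line)
print(m.group("log_timestamp"))  # 2018-02-28 08:33:38
print(m.group("iisSite"))        # host123
```

If this matches but nothing reaches Elasticsearch, the problem is upstream of the grok filter.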

Here is the filebeat.yml:


    filebeat.prospectors:
    - type: log
      enabled: false
      paths:
        - e:\Web Logs\W3SVC**.log
      document_type: iis_log

    filebeat.config.modules:
      path: ${path.config}/modules.d/*.yml
      reload.enabled: false

    setup.template.settings:
      index.number_of_shards: 3

    tags: ["location", "web Farm"]

    fields:
      environment: production

    setup.kibana:
      host: ""

    output.logstash:
      hosts: [""]

Have you checked the Logstash logs for problems? Have you checked whether the data is being added to a different index than you expect (use ES's cat indices API)?
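The cat indices check can be run with curl (assuming Elasticsearch is reachable on localhost:9200; substitute your own host):

```
# Look for a filebeat-YYYY.MM.dd index, and for documents landing
# somewhere unexpected (e.g. logstash-*); 'v' adds column headers.
curl -s 'http://localhost:9200/_cat/indices?v'
```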

Are you sure the type is iis_log? You could add a rubydebug output and check. You can also set document_type: in your iis_log prospector; that might solve your problem.
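A rubydebug output can sit alongside the existing elasticsearch output in beats.conf while debugging; every event is then printed to Logstash's stdout with all of its fields, including [type]:

```
output {
  stdout { codec => rubydebug }  # temporary: prints each event with all fields
}
```

Remove the block again once the field values have been confirmed.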

When I restart logstash I see the following warnings:

[2018-03-01T08:26:49,961][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[//], index=>"%{[@metadata][beat]}-%{+YYYY.MM.dd}", document_type=>"%{[@metadata][type]}", id=>"97fff7e442252ff4e458bd02f090f79a6de076c520ff83a8a0ce90b047e97b00", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6d1711a2-34a5-4d91-bfe9-8422f64f84f9", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}

[2018-03-01T08:26:50,497][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}

In my filebeat.yml I have "document_type: iis_log".
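Note that the warning is expected: mapping types are deprecated in Elasticsearch 6.x, and the per-prospector document_type option was removed from Filebeat in 6.0, so that setting is silently ignored. The usual replacement is a custom field (a sketch against the prospector shown above; with fields_under_root the field name `type` mirrors what the Logstash filter already tests):

```yaml
filebeat.prospectors:
- type: log
  paths:
    - e:\Web Logs\W3SVC**.log
  fields:
    type: iis_log        # replaces the removed document_type option
  fields_under_root: true
```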

I have refreshed the field list for all my indices.

OK, I feel silly. I suppose if I set enabled to true it might just work...


    - type: log
      enabled: false
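For anyone finding this later, the working prospector differs only in that one flag:

```yaml
- type: log
  enabled: true
```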

(2 min later...) What do you know, it works. Thanks for the help!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.