Grok filter not being triggered in Logstash server

hey guys!

I have a grok filter on my Logstash server for my IIS logs, but it seems the logs are not being filtered by it, and there are no errors in Logstash's logs, which is what I would expect to see if the problem were in the filter itself.
Here is some information that may help:
my beats.conf:
input {
  beats {
    port => 5044
    type => "log"
  }
}

filter {
  if [type] == "log" {
    grok {
      match => [ "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:S-SiteName} %{NOTSPACE:S-ComputerName} %{IPORHOST:S-IP} %{WORD:CS-Method} %{URIPATH:CS-URI-Stem} (?:-|\"%{URIPATH:CS-URI-Query}\") %{NUMBER:S-Port} %{NOTSPACE:CS-Username} %{IPORHOST:C-IP} %{NOTSPACE:CS-Version} %{NOTSPACE:CS-UserAgent} %{NOTSPACE:CS-Cookie} %{NOTSPACE:CS-Referer} %{NOTSPACE:CS-Host} %{NUMBER:SC-Status} %{NUMBER:SC-SubStatus} %{NUMBER:SC-Win32-Status} %{NUMBER:SC-Bytes} %{NUMBER:CS-Bytes} %{NUMBER:Time-Taken}" ]
    }
  }

output {
  elasticsearch {
    hosts => "10.175.142.92:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

my filebeat.yml:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:\Logfiles\*\*.log
  fields_under_root: true
  fields:
    type: iis

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

output.logstash:
  hosts: ["10.175.142.49:5044"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

logging.level: debug

and here is how the log is showing up in Kibana:

@timestamp February 15th 2019, 08:45:13.680
@version 1
_id PA-e8WgBKvRFGWguXXr9
_index filebeat-2019.02.15
_score -
_type doc
beat.hostname my_server
beat.name my_server
beat.version 6.5.4
fileset.module iis
fileset.name access
host.architecture x86_64
host.id e2b6035c-59e1-49c1-be70-dffd00525749
host.name my_server
host.os.build 9200.22640
host.os.family windows
host.os.platform windows
host.os.version 6.2
input.type log
message 2019-02-15 14:45:10 W3SVC1 my_server 10.160.227.166 GET /App_Themes/Basic/CSS/JQDatePicker/jquery.datepicker.css - 80 americas\myaccount 10.175.140.244 HTTP/1.1 Mozilla/5.0+(X11;+Linux+x86_64;+Catchpoint)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/59.0.3071.115+Safari/537.36 USPLASessionCookie=uspla302c04e69e27f4483db0461fa07daf4d88;+HasAppSupportRole=true http://my_server/uspla302c04e69e27f4483db0461fa07daf4d88/Cart/ExtractQuote?QuoteId=1001695266129_1_0_Q my_server 200 0 0 1473 642 31
offset 734,192
prospector.type log
source D:\Logfiles\W3SVC1\u_ex19021514.log
tags beats_input_codec_plain_applied
type log

Important information:
On the same Logstash port (5044) I'm receiving Winlogbeat logs too.
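Would branching on the beat name work to keep them apart? Something like this sketch (just an idea on my side, using the same [@metadata][beat] field my output's index pattern already relies on):

filter {
  if [@metadata][beat] == "winlogbeat" {
    # leave Winlogbeat events alone
  } else {
    # apply the IIS grok here
  }
}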

I would be really grateful if someone could help me out here!

The type option on the beats input is ignored, because filebeat sets type itself. I would expect your documents to have type iis (as set by filebeat), although I see the sample in Kibana does have type set to log. Not sure what is going on.
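If filebeat really is setting it, a minimal sketch of the matching conditional (your grok left unchanged) would be:

filter {
  if [type] == "iis" {
    grok {
      # your existing IIS match pattern here
    }
  }
}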

I just made a quick alteration and now the type is iis,

but the filter is still not working,

and the filter does work in the Grok Debugger.

That configuration is missing a }, are you sure logstash is running it?
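You can verify with the config-test flag (assuming beats.conf is the file your pipeline actually loads):

bin/logstash -f beats.conf --config.test_and_exit

An unbalanced brace should show up there as a configuration error.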

Where is it missing?

yep... Logstash is up and running

filter has 3 { but only 2 }
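For reference, here is where the third } belongs (grok pattern elided, everything else as in your beats.conf):

filter {
  if [type] == "log" {
    grok {
      match => ...  # unchanged
    }
  }
}  # <- this closing brace is the missing one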

In the Logstash logs I can find these two lines, coming from the same Filebeat agent:

[2019-02-18T14:09:44,824][DEBUG][logstash.pipeline        ] filter received {"event"=>{"prospector"=>{"type"=>"log"}, "host"=>{"os"=>{"version"=>"6.2", "build"=>"9200.22640", "platform"=>"windows", "family"=>"windows"}, "name"=>"my_server", "architecture"=>"x86_64", "id"=>"e2b6035c-59e1-49c1-be70-dffd00525749"}, "tags"=>["beats_input_codec_plain_applied"], "message"=>"2019-02-18 20:08:51 W3SVC1 my_server 10.160.227.166 GET / - 80 - 10.32.34.5 HTTP/0.9 - - - - 401 2 5 1293 7 0", "beat"=>{"version"=>"6.5.4", "name"=>"my_server", "hostname"=>"my_server"}, "offset"=>112068, "fileset"=>{"module"=>"iis", "name"=>"access"}, "@version"=>"1", "@timestamp"=>2019-02-18T20:09:43.816Z, "source"=>"D:\\Logfiles\\W3SVC1\\u_ex19021820.log", "input"=>{"type"=>"log"}, "type"=>"log"}}


[2019-02-18T14:10:50,960][DEBUG][logstash.pipeline        ] filter received {"event"=>{"prospector"=>{"type"=>"log"}, "host"=>{"os"=>{"version"=>"6.2", "family"=>"windows", "build"=>"9200.22640", "platform"=>"windows"}, "architecture"=>"x86_64", "name"=>"my_server", "id"=>"e2b6035c-59e1-49c1-be70-dffd00525749"}, "tags"=>["beats_input_codec_plain_applied"], "message"=>"2019-02-18 20:09:52 W3SVC1 my_server 10.160.227.166 GET / - 80 - 10.32.34.5 HTTP/0.9 - - - - 401 2 5 1293 7 0", "beat"=>{"version"=>"6.5.4", "name"=>"my_server", "hostname"=>"my_server"}, "offset"=>112538, "@version"=>"1", "@timestamp"=>2019-02-18T20:10:48.854Z, "source"=>"D:\\Logfiles\\W3SVC1\\u_ex19021820.log", "input"=>{"type"=>"log"}, "type"=>"iis"}}

the end of each line intrigues me:
"input"=>{"type"=>"log"}, "type"=>"log"}}
"input"=>{"type"=>"log"}, "type"=>"iis"}}

My guess is you have two prospectors configured, one of which sets type to log, and one which sets type to iis.

Any idea on how I can fix that?

My guess is that in addition to reading these files

paths:
  - D:\Logfiles\*\*.log

with a prospector in filebeat.yml, you are reading them again with one of the modules from modules.d/*.yml. Note that one of the two events contains

"fileset"=>{"module"=>"iis", "name"=>"access"}

maybe if I disable the iis.yml module in filebeat?

Yes.
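Assuming a default Filebeat install, you can do that from the Filebeat directory:

filebeat modules list          # shows which modules are enabled
filebeat modules disable iis   # renames iis.yml to iis.yml.disabled

and then restart Filebeat.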

no luck, same scenario

Hi Carlos!

The best way to test Logstash is with this:


input {
  generator { <your config here> }
}
 

filter {
  grok {
    match => [ "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:S-SiteName} %{NOTSPACE:S-ComputerName} %{IPORHOST:S-IP} %{WORD:CS-Method} %{URIPATH:CS-URI-Stem} (?:-|\"%{URIPATH:CS-URI-Query}\") %{NUMBER:S-Port} %{NOTSPACE:CS-Username} %{IPORHOST:C-IP} %{NOTSPACE:CS-Version} %{NOTSPACE:CS-UserAgent} %{NOTSPACE:CS-Cookie} %{NOTSPACE:CS-Referer} %{NOTSPACE:CS-Host} %{NUMBER:SC-Status} %{NUMBER:SC-SubStatus} %{NUMBER:SC-Win32-Status} %{NUMBER:SC-Bytes} %{NUMBER:CS-Bytes} %{NUMBER:Time-Taken}" ]
  }
}

output {
   stdout { codec => rubydebug }
}

You can find the documentation for the generator input in our docs, but with that you can see whether Logstash itself is working well...

It will generate a line, parse it with the grok, and output it to the console (or to a file if you want). You should see it parsed. After testing this, plug in the correct input and see if events are flowing. Then add the conditionals to the filter, AND FINALLY add the elasticsearch output.
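As a concrete sketch, you could feed the generator one of the lines from your own debug output:

input {
  generator {
    message => "2019-02-18 20:08:51 W3SVC1 my_server 10.160.227.166 GET / - 80 - 10.32.34.5 HTTP/0.9 - - - - 401 2 5 1293 7 0"
    count => 1
  }
}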

That way you will isolate issues and build the pipeline incrementally, which will be helpful for you. Check the enabled modules, check that they are generating data, and check how things are flowing. Isolate things one by one, but don't complicate the Logstash pipeline at first, and don't plug in Elasticsearch until you are sure the output is doing what you want.

Thanks!
--Gabriel

hey Gabriel,
I really appreciate your reply on this matter, but my struggle to make the logs pass through the filter continues.
I've followed all the instructions in the docs and watched a Pluralsight course that explained how to install and configure the ELK stack... All my instances are working fine, just the filter doesn't, and I don't know why. :frowning:

Hi Carlos,

Did you try the filters WITHOUT any conditionals?

I think that there might be a couple of things happening:

  1. Your Logstash configuration isn't being run because it's using modules and not the right configuration. The pipelines.yml file might be pointing at a specific pipeline and not at the file that contains your filters. So it's not that the filter doesn't work, but that the entire config is not being used as expected.
  2. The if/else conditionals aren't working as expected, but I don't think so, which brings us back to the previous point.

Try to do what I mentioned in my previous post, following this: https://www.elastic.co/guide/en/logstash/current/running-logstash-command-line.html

bin/logstash -f <file-name-here>

where <file-name-here> is the path to the file containing your Logstash configuration. That might help you understand what is going on. If that works, then it's point [1] that is causing issues.
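If it does turn out to be point [1], a typical pipelines.yml entry looks like this (the path below is an assumption for a package install; adjust it to your layout):

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"   # must cover the file with your filter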

Thanks!

--Gabriel

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.