Edit IIS Module Pipeline

Hi,

I'm at a company where I need to ingest data from IIS.
I've enabled the IIS module, but I'd like to modify the pipeline a bit. I've found that the pipelines are stored in Filebeat's modules folder (on Windows: filebeat/module/iis/access/ingest/pipeline.yml). However, from my tests, the pipelines are not executed by Filebeat but transmitted to Elasticsearch.

And so, here is my issue. I need to parse url.path and put its first segment into another field.

I've tried multiple things:
Since I'm working with Logstash, I tried writing a grok pattern there. Unfortunately, the url.path field does not exist yet at that point, because the IIS event has not been parsed.
I've also tried editing access/ingest/pipeline.yml and adding the grok pattern right after the message is parsed (roughly the processor sketched below).
Unfortunately, since that pipeline file is neither executed by Filebeat nor re-transmitted to Elasticsearch (I might be wrong on this part, feel free to correct me), my new grok processor is not executed.
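To be concrete, the processor I tried to append to access/ingest/pipeline.yml looked roughly like this (the target field name iis.site_section is just an example, not an actual module field):

    # hypothetical grok processor: copy the first segment of url.path into iis.site_section
    - grok:
        field: url.path
        patterns:
          - '^/%{WORD:iis.site_section}'
        ignore_missing: true
        ignore_failure: true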

If anyone has an idea of how I could add a grok processor to the IIS pipeline, I would really appreciate it.

Oh, and by the way, I'd rather not reparse the message in Logstash; I'm looking for another solution that would be more "efficient".

Thanks in advance.

You are correct that the ingest pipeline you see in Filebeat is pushed to Elasticsearch when the module is set up. You can modify the pipeline in Kibana to add additional processors: under the Stack Management page there is an Ingest Node Pipelines page; just search for IIS and it should show up. You'll need to make the modifications again each time you upgrade Filebeat.
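If you prefer the API, something like this in the Kibana Dev Tools console should list the module's pipelines (the exact name includes the Filebeat version, hence the wildcard):

    GET _ingest/pipeline/filebeat-*-iis-*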


The Filebeat module uses an ingest pipeline that runs on Elasticsearch ingest nodes; this works if you send your data directly from Filebeat to Elasticsearch.

If you send your data to Logstash first, you need to configure Logstash to use the ingest pipelines in Elasticsearch.

Without that configuration in your Logstash output, the ingest pipeline created by the Filebeat module will not even be used.
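A minimal sketch of that output configuration, following the pattern from the Filebeat modules documentation (hosts and index naming are placeholders; Filebeat puts the pipeline name in [@metadata][pipeline] when a module defines one):

    output {
      if [@metadata][pipeline] {
        elasticsearch {
          hosts => ["http://localhost:9200"]    # placeholder
          manage_template => false
          index => "%{[@metadata][beat]}-%{[@metadata][version]}"
          pipeline => "%{[@metadata][pipeline]}"  # run the Filebeat module ingest pipeline
        }
      } else {
        elasticsearch {
          hosts => ["http://localhost:9200"]    # placeholder
          manage_template => false
          index => "%{[@metadata][beat]}-%{[@metadata][version]}"
        }
      }
    }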

I think the best solution for your use case is to configure Logstash to use the ingest pipeline in Elasticsearch, so you won't need to parse the message in Logstash, and then create a new ingest pipeline in Elasticsearch and use it as the final_pipeline.

You just need to create a new ingest pipeline and then change the template of your index to use this pipeline as the index.final_pipeline.
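As a sketch, assuming a pipeline name of iis-final and a hypothetical target field iis.site_section (the grok pattern and field names are just examples), it could look like this in Dev Tools:

    PUT _ingest/pipeline/iis-final
    {
      "description": "Runs after the Filebeat IIS module pipeline, copies the first url.path segment",
      "processors": [
        {
          "grok": {
            "if": "ctx.url?.path != null",
            "field": "url.path",
            "patterns": ["^/%{WORD:iis.site_section}"],
            "ignore_failure": true
          }
        }
      ]
    }

    PUT filebeat-*/_settings
    {
      "index.final_pipeline": "iis-final"
    }

The _settings call only covers existing indices; for new indices you would also add index.final_pipeline to the Filebeat index template (for example via setup.template.settings in filebeat.yml).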

This ingest pipeline will always be executed after the module ingest pipeline created by Filebeat, so you avoid having to edit the module ingest pipeline every time you update Filebeat.


Thanks to both of you, @legoguy1000 and @leandrojmp!
I'll give it a try right now and let you know!

Alright, so I've modified the IIS ingest pipeline. Now I'd like to do it the way @leandrojmp advises.
By any chance, do you know how I need to implement the conditional processors?
I've tried to do it the same way I did in Logstash, but it did not work, and I couldn't find any documentation on this.
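For reference, in Logstash I would have expressed the condition roughly like this (field names are just an example), but I can't figure out how to write the equivalent condition on an ingest processor:

    filter {
      # only grok the event if url.path exists
      if [url][path] {
        grok {
          match => { "[url][path]" => "^/%{WORD:[iis][site_section]}" }
        }
      }
    }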
