[Ingest pipeline] tag and trace match

Hi,

I'm trying to migrate my Logstash filters to an ingest pipeline, but I'm stuck on an issue:
I'm parsing VMware ESX logs with a grok pattern, and I would like to add a field only when the grok pattern matches.

In Logstash I have something like this:

   grok {
        match => {
          "message" => [
            "%{DATA:event.security} %{DATA:event.provider} \[%{DATA:event.module}\] %{GREEDYDATA:event.reason}",
            "%{GREEDYDATA:event.reason}"
          ]
        }
        # add_field is only applied when one of the patterns matches
        add_field => { "host.os.type" => "ESX" }
   }

I don't see how to do that with an ingest pipeline.

My first try was to build a pipeline with the following processors (full JSON sketch below):

  • grok processor:
    pattern = %{DATA:event.security} %{DATA:event.provider} \[%{DATA:event.module}\] %{GREEDYDATA:event.reason}
    pattern = %{GREEDYDATA:event.reason}
    tag = grok1
  • set processor:
    if: ctx.tag == 'grok1'
    set host.os.type = ESX
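
Expressed as an actual pipeline definition, my attempt looks roughly like this (just a sketch; the pipeline name vmware-esx is made up and the options are written from memory):

   PUT _ingest/pipeline/vmware-esx
   {
     "processors": [
       {
         "grok": {
           "field": "message",
           "patterns": [
             "%{DATA:event.security} %{DATA:event.provider} \\[%{DATA:event.module}\\] %{GREEDYDATA:event.reason}",
             "%{GREEDYDATA:event.reason}"
           ],
           "tag": "grok1",
           "trace_match": true
         }
       },
       {
         "set": {
           "field": "host.os.type",
           "value": "ESX",
           "if": "ctx.tag == 'grok1'"
         }
       }
     ]
   }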

But when I look in Kibana, the "tags" field is empty, so the set processor doesn't work.

  • Is my logic good? If not, how should I do this? Is it possible to run a processor only if the previous one succeeded? (See the example after this list for the kind of condition I have in mind.)
  • Is it normal that the tag field is not filled in by my grok processor?
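
To illustrate what I mean by "only if the previous one succeeded": I was imagining something like a condition on a field that only exists after the first pattern matched. Not tested, just the idea, and ctx.event?.module is only my guess at how to address the grok-created field in the condition:

   {
     "set": {
       "field": "host.os.type",
       "value": "ESX",
       "if": "ctx.event?.module != null"
     }
   }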

Other thing: when I use the "trace_match" option in the grok processor together with "_simulate", I'm able to see the _ingest metadata for debugging. Is this _ingest metadata only visible in simulate mode? I'm not able to see the _ingest metadata in Kibana/Discover when I look at the documents.
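
For reference, this is roughly how I test it (the pipeline name matches the sketch above and the log line in the document is a made-up ESX-style example):

   POST _ingest/pipeline/vmware-esx/_simulate?verbose=true
   {
     "docs": [
       {
         "_source": {
           "message": "info hostd [Originator@6876 sub=Vimsvc] Some reason text"
         }
       }
     ]
   }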

Thanks for your help
