Removing fields from Logstash

input {
  beats {
    port => 5044
  }
}
filter {
  if "/var/log/abc.log" in [log][file][path] {
    mutate {
      add_field => { "employee_id" => "abc123" }
    }
  }
}

This is my input configuration for Logstash.

I don't see a real question in your last comments, so I'm not sure what you are looking for.

For Logstash, you could do something like:

filter {
  if [@metadata][env] {
    if [@metadata][aix] {
      if [@metadata][env] == "int" {
        mutate {
          add_field => { "environment" => "dev" }
        }
      }
      if [@metadata][env] == "tst" {
        mutate {
          add_field => { "environment" => "int" }
          update    => { "[@metadata][domain]" => "vaja_tst2" }
        }
      }
      if [@metadata][env] != "int" and [@metadata][env] != "tst" {
        mutate {
          # use %{...} sprintf syntax so the field's value is copied, not the literal string
          add_field => { "environment" => "%{[@metadata][env]}" }
        }
      }
    }
  }
  # If the variable exists
  if [@metadata][rhel] {
    mutate {
      add_field => { "[@metadata][domain]" => "%{environment}justxdom" }
    }
  }
}

Actually, I want to extract the log details into fields.

[Debug] : 2023-09-04T11:22:24 -> [Request - {"ABC":{"Service":{"Channel":{"HostIP":{"Type":"IP","Value":"0.0.0.0"},"Name":"ABC.COM","Type":"AGENT"},"Device":{"IP":{},"OS":"web","Version":"116"},"Name":"ABCD","Type":"XYZ","Code":"IN"

I tried the grok filter too but was unable to implement it.

Are you sure this is the full log line?
There is an opening [Request but no closing square bracket ] at the end. The JSON structure is also incomplete.

Hi @bharti, have you thought about using a different strategy? Maybe you can use an ingest pipeline instead.
Here is the wiki.
Let me know if this is helpful. :slight_smile:

Yes, yes.
I just put a small part of the JSON to show the structure.

Hi Samuele... I'm unable to understand :sweat_smile:

You are on the right track.

You can operate on the fields you get afterwards; see the sketch below.
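A minimal sketch of operating on parsed fields (the field names here are placeholders, not taken from your data):

filter {
  mutate {
    # rename a parsed field to something friendlier
    rename => { "[parsed][Status]" => "status" }
    # drop intermediate fields you no longer need
    remove_field => [ "payload" ]
  }
}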

PS: Logstash is a dedicated product for log ingestion and much easier to operate than Elasticsearch ingest pipelines, so I'd discourage the suggestion from Samuele_Lolli.

Sorry for not being clear.
Instead of removing fields directly in Logstash, you can use an ingest pipeline.
The usage is something like this:
in the output part of Logstash you put the name of the ingest pipeline, and then in Kibana you create that ingest pipeline, as in the sketch below.
You can use different processors to do a lot of different operations.
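A minimal sketch of the output side (the host, index, and pipeline name are placeholders; use whatever name you give the pipeline in Kibana):

output {
  elasticsearch {
    hosts    => ["http://localhost:9200"]
    index    => "app-logs"
    # name of the ingest pipeline created in Kibana (Stack Management > Ingest Pipelines)
    pipeline => "my-ingest-pipeline"
  }
}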

Let me know if it is clear now.

input {
  beats {
    port => 5044
  }
}
filter {
  if "/var/log/abc.log" in [log][file][path] {
    grok {
      match => { "message" => "status: %{WORD:Status}" }
    }
  }
}

My Logstash input configuration.

Can you add this output plugin and post a sample output message? Then we know what your full messages look like.
It's quite hard to help you with parsing if we don't know what everything looks like.

output {
  file {
    path  => "/<path>/logstash_debug/incomingMessages.log"
    codec => rubydebug { metadata => true }
  }
}

Sorry chouben... but the message part is confidential.

Just think of a simple JSON message... my data looks just like that.

[Debug] : 2023-09-05T08:59:40 -> [Response - {"abc":{"Service":{"Channel":{"HostIP":{"Status":"APPROVED"},
"Name":"abc.COM","Type":"AGENT"}}}}]

I sent one customised message.

Here I want that status part in a separate field... so how can I achieve that?

input {
  beats {
    port => 5044
  }
}
filter {
  if "/var/log/abc.log" in [log][file][path] {
    grok {
      match => { "message" => "status: %{WORD:Status}" }
    }
  }
}

I am trying to fetch the status part, like

status: APPROVED

in the event.

My assumption is that this part ends up in the "message" field:

That implies your parsing won't work; the grok pattern won't match the log line above.

Take a look at my Filebeat configuration above. I already parse the log line into multiple fields there. You should do something similar; a sketch follows below.
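One way to get there, assuming the whole line ends up in "message" and the payload is valid JSON. The pattern and field names below are a sketch based on your redacted sample, so adjust them to your real data:

filter {
  # split the line into level, timestamp, direction, and the raw JSON payload
  grok {
    match => { "message" => "\[%{WORD:level}\] : %{TIMESTAMP_ISO8601:timestamp} -> \[%{WORD:direction} - %{GREEDYDATA:payload}\]" }
  }
  # parse the JSON payload into a nested field
  json {
    source => "payload"
    target => "parsed"
  }
  # copy the deeply nested status into a top-level field
  mutate {
    copy => { "[parsed][abc][Service][Channel][HostIP][Status]" => "Status" }
  }
}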

So you are saying that I have to parse the whole JSON in the match part?