Ingest specific JSON fields to Elasticsearch

Hi, I am ingesting the following JSON data into Elasticsearch through Logstash. There are around 100 different fields in the document, but I am only interested in sending four of them: linux.domain, linux.environment_id, linux.hostname, and linux.id.
Is there a better way to include only the four fields I am interested in sending to Elasticsearch, instead of using the remove_field option and manually listing all 100 fields in the config?

grok {
  remove_field => ["linux.architecture","linux.architecture_id","linux.architecture_name"]  # <-- do not want to remove each field manually like this
}

My Logstash config:

input {
  beats {
    port => "5044"
    type => ["doc"]
  }
}

filter {
  grok {
    match => { "message" => "^{?\n?.*?(?<logged_json>{.*)" }
  }
  json {
    source => "logged_json"
    target => "linux"
#    remove_field => "parameter"
  }
}

output {
  if [type] == "doc" {
    elasticsearch {
      hosts => [ "10.138.7.51:9200" ]
      index => "testing-%{+YYYY-MM-dd}"
    }
  }
  else {
    elasticsearch {
      hosts => [ "10.138.7.51:9200" ]
      index => "tester-%{+YYYY-MM-dd}"
    }
  }
}

Use a mutate filter to rename (i.e. move) the wanted fields to the top level of the events, then use a mutate filter to remove the linux field and all of its subfields.

Note the syntax Logstash requires for the nested fields you want to rename: https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references
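A minimal sketch of that suggestion, using the four field names from the question (the top-level target names here are my own choice, not prescribed by the answer):

```
filter {
  mutate {
    # Move the four wanted fields to the top level of the event.
    # Note the [linux][domain] bracket syntax for referencing nested fields.
    rename => {
      "[linux][domain]"         => "domain"
      "[linux][environment_id]" => "environment_id"
      "[linux][hostname]"       => "hostname"
      "[linux][id]"             => "id"
    }
  }
  mutate {
    # Remove the linux field along with all remaining subfields.
    remove_field => ["linux"]
  }
}
```

Two separate mutate blocks are used because operations inside a single mutate filter are not guaranteed to run in the order you need; splitting them ensures the renames happen before the removal.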


Thanks Magnus,

It worked as per your suggestion.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.