Modelling to Elastic Common Schema (ECS): best practices in Logstash

Hi,
Being a big fan of ECS, I understand that most of these mappings are built in for known products and technologies when Beats does the parsing, but how should I tackle this in Logstash?

My scenario is as follows:

  • Data arrives at the data lake via syslog, in RFC 5424 format
  • Beats picks it up and sends it without modification to Logstash (so no Beats module is used here)
  • Logstash does the data modelling and transformations
  • If I use the built-in patterns such as the Linux syslog ones, the extracted fields are NOT in ECS
  • The syslog fields such as syslog5424_host etc. are nevertheless good-quality extractions (roughly as sketched below)
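
For reference, the RFC 5424 extraction currently looks roughly like this (a minimal sketch; the exact pattern in my pipeline may differ slightly). The built-in SYSLOG5424LINE grok pattern is what produces the syslog5424_* fields mentioned above:

filter {
  grok {
    # Built-in RFC 5424 pattern; its captures include syslog5424_host,
    # syslog5424_app, syslog5424_proc and syslog5424_msg
    match => { "message" => "%{SYSLOG5424LINE}" }
  }
}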

So my questions are:

  1. How do I map the fields extracted by these patterns onto ECS fields?
  2. Should I do "mutate" in my custom filters one by one, or is there an easier/better way? (See also the alternative sketched at the end of this post.)

I'm currently planning to do something like this in Logstash (just checking whether this is an efficient way):

filter {
  mutate {
    # copy keeps the original field and adds the ECS-named one
    copy    => { '[srcip]' => '[source][address]' }
    copy    => { '[srcip]' => '[source][ip]' }
    copy    => { '[new_event][srcip]' => '[source][ip]' }
    # rename moves the field to its ECS name, then convert fixes the type
    rename  => { '[srcport]' => '[source][port]' }
    convert => { '[source][port]' => 'integer' }
    copy    => { '[destip]' => '[destination][address]' }
  }
}
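
One alternative I'm wondering about (for question 2): newer Logstash versions have an ecs_compatibility option on the grok filter (and a pipeline-wide pipeline.ecs_compatibility setting), which is supposed to make the bundled patterns capture into ECS field names directly rather than the legacy syslog5424_* names. A rough sketch of what I mean, assuming I've understood the option correctly:

filter {
  grok {
    # With ECS compatibility enabled, the bundled patterns should use ECS
    # field names for their captures instead of the legacy ones; the exact
    # fields emitted would need to be checked against the grok pattern docs
    ecs_compatibility => "v1"
    match => { "message" => "%{SYSLOG5424LINE}" }
  }
}

Would that (plus a few targeted mutate renames for anything the patterns don't cover) be the cleaner route, or is a hand-written mutate mapping like the one above still the recommended practice?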
