Parse logs using Logstash and map them to ECS format

I am receiving WAF (Web Application Firewall) logs and network scan result log files (XML),
and I am trying to send these logs to Elasticsearch using Logstash.

I want to parse these logs according to the ECS format.

However, the ECS documentation just gives the specification, and I don't know how to apply it.

The WAF and scan logs are custom logs; there is no related module in Filebeat.

I think it would take a long time to build a Filebeat module, so I am trying to use Logstash instead.

Is there an example related to this?

Hi @111387!

I don't have a specific example Logstash config for either a WAF or network scan log to direct you towards. I encourage you to review some of the past discussion threads discussing Logstash and ECS for general guidance, such as here and here.

As you build out your Logstash ingest pipelines, you'll want to look carefully not only at the correct field names but at the field data types as well. The ECS GitHub repo also contains some additional resources to help, including example Elasticsearch index templates and tooling to help users manage their own custom field definitions.
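To make this concrete, here is a minimal sketch of an ECS-style mapping in a Logstash pipeline. The source field names (`src_ip`, `dst_port`, `req_url`) are hypothetical stand-ins for whatever your WAF actually emits; the ECS field names and the pattern of renaming first, then converting types, are the general idea:

```
filter {
  mutate {
    # Rename custom WAF fields (hypothetical names) to their ECS equivalents.
    rename => {
      "src_ip"   => "[source][ip]"
      "dst_port" => "[destination][port]"
      "req_url"  => "[url][original]"
    }
  }
  mutate {
    # ECS defines destination.port as a numeric type, so convert the
    # string value in a separate mutate block after the rename.
    convert => { "[destination][port]" => "integer" }
  }
}
```

You would pair this with an index template that declares the matching ECS data types, as mentioned above.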

I'd also highly recommend reviewing the relevant areas of the ECS documentation.

I overlooked mentioning ecs-mapper! :smile:

The ecs-mapper tool can convert a field mapping CSV into an equivalent pipeline for Logstash, Elasticsearch, or Beats. You can view an example Logstash output from ecs-mapper here.

The ecs-mapper tool does not appear to account for the fact that a single mutate filter executes its operations in a fixed order, so that example output will not work:

```
copy => { '[destport]' => '[destination][port]' }
convert => { '[destination][port]' => 'integer' }
```

Within a single mutate filter, convert is executed before copy, so here convert runs against a field that does not exist yet. That's why rename is a good choice: it runs early in mutate's fixed order of operations.
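One way to work around the ordering (a sketch, reusing the same hypothetical `destport` field from the example above) is to split the operations into two separate mutate filters, which Logstash executes sequentially in the order they appear:

```
filter {
  mutate {
    # First mutate block: copy runs here on its own...
    copy => { '[destport]' => '[destination][port]' }
  }
  mutate {
    # ...so by the time this second block runs, the target
    # field exists and can be converted to an integer.
    convert => { '[destination][port]' => 'integer' }
  }
}
```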


Good catch @Badger!

I've filed an issue in the ecs-mapper GitHub repo.
