I don't have a specific example Logstash config for either a WAF or network scan log to point you toward. I encourage you to review some of the past threads discussing Logstash and ECS for general guidance, such as here and here.
As you build out your Logstash ingest pipelines, you'll want to look carefully not only at the correct field names but also at the field data types. The ECS GitHub repo contains additional resources to help, including example Elasticsearch index templates and tooling for managing your own custom field definitions.
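As an illustration only (not an official example), a minimal Logstash filter that renames vendor-specific WAF fields to their ECS equivalents and coerces data types might look like the sketch below; the source field names (`src_ip`, `dst_port`, `rule_name`) are hypothetical placeholders for whatever your log actually contains:

```
filter {
  mutate {
    # Rename vendor-specific fields (hypothetical names) to their ECS equivalents
    rename => {
      "src_ip"    => "[source][ip]"
      "dst_port"  => "[destination][port]"
      "rule_name" => "[rule][name]"
    }
    # Make sure the data type matches the ECS definition
    # (destination.port is numeric, not a string)
    convert => {
      "[destination][port]" => "integer"
    }
  }
}
```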
I'd also highly recommend reviewing the relevant areas of the ECS documentation.
The ecs-mapper tool can convert a field mapping CSV into an equivalent starter pipeline for Logstash, as well as for Elasticsearch and Beats. You can view an example Logstash output from ecs-mapper here.
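As a rough sketch of that workflow: given a mapping CSV row that maps a hypothetical `client_ip` field to `client.ip` using a copy action, the Logstash filter ecs-mapper generates is roughly of this shape (illustrative, not the tool's exact output):

```
filter {
  mutate {
    # Copy the original field to its ECS destination,
    # preserving the source field as-is
    copy => { "client_ip" => "[client][ip]" }
  }
}
```

If you use a rename action instead of a copy action in the CSV, the generated filter drops the original field and keeps only the ECS one.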