Grok Ingest Pipeline and the Elastic Common Schema

Hi there,

I want to use the Grok patterns already available in the Elastic ingest pipeline; however, I don't see how to map the default field names they produce to the Elastic Common Schema (ECS). Is there any work toward that currently? I have seen a number of articles about using custom Grok patterns with ECS, but not about using the patterns that already ship with Elasticsearch. Not sure if I missed something obvious.

Roshan

I think an example might help. Which existing ingest pipeline are you referring to, and what version of Elastic are you on?

Hey, sorry for the lack of detail.

I was just looking at the code first, to see whether this would even be possible for our project: https://github.com/elastic/elasticsearch/blob/master/libs/grok/src/main/resources/patterns/bro.

We are trying to ingest BRO data, and I was hoping to use the GrokParser. However, I noticed that the captures map to custom field names that are not ECS-compatible. For example, I was expecting the resp_h field to end up as destination.ip from ECS: https://www.elastic.co/guide/en/ecs/current/ecs-destination.html.
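In case it helps illustrate the question: one workaround I was considering is to keep the shipped Grok patterns and remap the captures afterwards with rename processors. A minimal sketch of such an ingest pipeline, assuming the bro pattern file exposes a conn pattern whose captures include orig_h and resp_h (the exact pattern and capture names would need to be checked against the file linked above):

```json
PUT _ingest/pipeline/bro-conn-ecs
{
  "description": "Sketch: parse a Zeek/Bro conn line with the shipped pattern, then rename captures to ECS",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{BRO_CONN}"]
      }
    },
    {
      "rename": {
        "field": "orig_h",
        "target_field": "source.ip",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "resp_h",
        "target_field": "destination.ip",
        "ignore_missing": true
      }
    }
  ]
}
```

That obviously means hand-maintaining the mapping for every field, which is what I was hoping to avoid.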

There seems to be documentation for this here: https://www.elastic.co/guide/en/beats/filebeat/current/exported-fields-zeek.html

destination.ip is an ECS common field. Looking at the date of that GitHub file, it's from Feb 2018. There is a Filebeat Zeek module that does this mapping for you: https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-module-zeek.html
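If Filebeat is an option, enabling the Zeek module is mostly configuration. A minimal sketch of modules.d/zeek.yml, assuming Zeek/Bro writes its logs under /var/log/bro/current (adjust the path to your deployment):

```yaml
# modules.d/zeek.yml — enable the conn fileset and point it at the log file
- module: zeek
  connection:
    enabled: true
    var.paths: ["/var/log/bro/current/conn.log"]
```

The module ships its own ingest pipelines, so fields such as resp_h arrive in Elasticsearch already renamed to ECS fields like destination.ip.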

Are you using Filebeat?