ECS Field Name (Fortigate Dataset)

I am collecting Fortigate logs with the Elastic Agent and the Fortigate integration. These are then shipped to Logstash for some custom enrichment before being pushed to Elastic Cloud. One of the custom enrichments adds the field source.user.account with the value from source.user.name, for later translation. However, I cannot find the correct syntax to reference the source field.

In an event, the field name is source.user.name, which is the same in the JSON viewer; here is a screenshot of it:

[screenshot]

However, I have tried the variations below with zero success.

source.user.name
[source.user.name]
[source.user][name]
[source][user.name]
[source][user][name]
source.user.name.text
[source.user.name.text]
[source.user][name][text]
[source][user.name][text]
[source][user][name][text]
user.name
[user.name]
[user][name]

The weird thing is, I've put the add_field action behind a conditional using if "source.user.name" { }, and the add_field action fires, which suggests source.user.name is present. But it adds the literal text from the right-hand side of the add_field setting instead of the field's value. For example, this:

      mutate {
        add_field => {
          "[source][user][account]" => "%{[source][user][name][text]}"
        }
      }
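For context on that behavior: in Logstash, a quoted string inside a conditional is a string literal and always evaluates as truthy, so `if "source.user.name"` fires whether or not the field exists; and when a `%{...}` sprintf reference points at a field that doesn't exist, Logstash emits the reference text literally. A minimal sketch of the conditional and sprintf syntax that would reference the field itself, assuming the field were actually present at filter time:

```
# Bracket notation, no quotes, is a field reference; "..." is a literal string.
if [source][user][name] {
  mutate {
    add_field => {
      # sprintf references use the same [a][b][c] bracket form
      "[source][user][account]" => "%{[source][user][name]}"
    }
  }
}
```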

results in this:
[screenshot]

I suppose the pipeline config would help. It's gone through a few iterations, but it's still doing the same thing.

filter {
  if [data_stream][namespace] == "fortigate" {
    if "[user][name]" {
      mutate {
        add_field => {
          "[source][user][full_name]" => "%{[user][name]}"
        }
      }
    }
  }
}


Can you provide more context about how you are sending the logs and what you are doing in Logstash?

The Elastic Agent and its integrations use Elasticsearch ingest pipelines to parse the data, so you won't have any parsed fields in Logstash; it is not clear what you are trying to enrich and how.

The Elastic Agent outputs to Logstash, and Logstash then sends the events to Elastic Cloud. This setup is recommended by Elastic itself. I want to enrich the event data with additional information using the translate filter.

Your comment gave me the idea to output the events to a file and see what they look like. It turns out the agent outputs a JSON-formatted event with fields related to the agent, ECS, and a couple of other things. It places the original, unparsed event in the message field and then ships it to Elastic for the ingest pipeline to parse.
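For anyone else hitting this, an illustrative (not verbatim) shape of such an event, with the raw firewall log kept in message:

```json
{
  "agent": { "type": "filebeat", "version": "8.x" },
  "ecs": { "version": "8.x" },
  "data_stream": { "type": "logs", "dataset": "fortinet_fortigate.log", "namespace": "fortigate" },
  "message": "date=... time=... devname=... srcip=... user=..."
}
```

The key point is that source.user.name does not exist yet at this stage; it is only created later by the integration's ingest pipeline in Elasticsearch.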

Since the source of my translations is information only available in my environment, I modified the Logstash pipeline to copy the message field, parse the copy, and place the new fields under a temp object (temp.firewall.name, for example), and from there I can run my translations. Once I've finished all my enrichments, I delete the temp object, removing all the temporarily parsed fields.
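A rough sketch of that approach, assuming Fortigate's key=value syslog format and a hypothetical dictionary file path (the kv target name and dictionary path are placeholders, not the exact config used):

```
filter {
  if [data_stream][namespace] == "fortigate" {
    # parse a copy of the raw event into a scratch object
    mutate { copy => { "message" => "[temp][raw]" } }
    kv { source => "[temp][raw]" target => "[temp][firewall]" }

    # enrich from a local lookup table
    translate {
      source          => "[temp][firewall][user]"
      target          => "[source][user][account]"
      dictionary_path => "/etc/logstash/users.yml"   # hypothetical path
    }

    # drop the scratch fields once enrichment is done
    mutate { remove_field => ["temp"] }
  }
}
```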

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.