On ingest I need to create a new field based on the value of a field in the document

I have a field X that has the value fw.auth.deny or fw.auth.allow. How can I create a new field, called y, that has the value fw in it, using Logstash or mapping templates?

Thanks in advance.

Chris

P.S. I looked around a lot and could not find an example of how to do this.

Use a grok filter. Why should "fw" be extracted? Does X always begin with "fw"? Or do you want to extract everything up to the first period?
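
If it's the former and there are only a couple of literal values to handle, a conditional plus a mutate filter would do. A minimal sketch, assuming the field really is named X and the new field should be y:

filter {
  # Only events whose X field matches one of the known values get the new field.
  if [X] in ["fw.auth.allow", "fw.auth.deny"] {
    mutate {
      add_field => { "y" => "fw" }
    }
  }
}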

I have a bunch of patterns that I need to match

fw.auth.allow
fw.auth.deny
should set a new field to fw

av.detect.virus
av.delete.virus
av.detect.content
av.delete.content
should set a new field to av

and so on for the other prefixes.

Is there somewhere I can find a Logstash config file with an example similar to what I need to do?

Chris
P.S. Thanks.

Would this work?

input {
  file {
    path => "/data2/logstash/*.csv"
    type => "utm"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["g_date", "g_hour", "g_hostname", "nsm_type",
                "s_ip", "t_ip", "n_ip", "t_port", "transport_proto", "in_interface",
                "out_interface", "group_id", "user_id", "object_name", "object_access",
                "http_hostname", "uri", "s_port", "s_bytes", "t_bytes", "duration",
                "g_timestamp"]
    separator => ","
  }
  mutate {
    lowercase => [ "g_hostname" ]
  }
}

filter {
  if [nsm_type] == "fw.auth.allow" or [nsm_type] == "fw.auth.deny" {
    mutate {
      add_field => { "service" => "fw" }
    }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["10.140.56.140:9700", "10.140.56.141:9700", "10.140.56.142:9700", "10.140.56.143:9700", "10.140.56.144:9700", "10.140.56.145:9700"]
    index => "utm-%{g_date}"
    workers => 12
  }
  stdout {
    codec => rubydebug
  }
}

Looks pretty reasonable. To obtain the first period-delimited token from nsm_type, use a grok filter that extracts all characters at the beginning of the string except periods into a new field:

grok {
  match => {
    "nsm_type" => "^(?<desired-name-of-field>[^.]+)"
  }
}
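
That single grok could replace the second filter block in your draft, so you don't need one conditional per prefix. A sketch, assuming the new field should be called service as in your example:

filter {
  grok {
    # Capture everything before the first period into "service".
    match => {
      "nsm_type" => "^(?<service>[^.]+)"
    }
  }
}

Any nsm_type beginning with fw. then yields service => "fw", the av.* types yield "av", and so on.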