Grok filter add_field


I've tried using an add_field in the grok filter.
I want to extract the domain name from the log files I have.

I've added the domain names to my Apache logs and can see them there, but I'm not sure how to extract them.
What I have so far is this:

input {
  beats {
    port => 5044
    host => ""
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
    add_field => [ "host" => "%{host}" ]
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

I'm not sure if it's even correct.
I've restarted everything, but I don't see the field in Kibana.

An example of a line from the log file: - - [07/Apr/2020:14:58:17 +0200] "GET /images/betaalmethodeimages/paynl/mastercard.png HTTP/1.1" 200 3436 "" "Mozilla/5.0 (Linux; Android 10; ELE-L29) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.162 Mobile Safari/537.36"

I hope you guys can help me out!

That would set the [host] field to the existing value of the [host] field, which doesn't make a lot of sense. What are you trying to do?
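For context, add_field is meant to copy or combine existing values into a new field, not to overwrite a field with itself. A minimal sketch, assuming a hypothetical source field named [vhost] that already holds the domain:

```
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
    # hypothetical: copies the existing [vhost] value into a new [domain] field;
    # add_field only runs when the grok match succeeds
    add_field => { "domain" => "%{vhost}" }
  }
}
```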

Hi @Badger,

What I'm trying to do is extract the domain name for later use.
Right now we have about 1000 different domain names that we want to filter on.

Unfortunately, right now I can't get that information out of the logs.

Are you saying you want to parse the domain name from a fully qualified host name?
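If that is the goal — stripping the first label from a fully qualified host name — a minimal sketch, assuming the name lives in a hypothetical field called [hostname] (e.g. "shop.example.com"):

```
filter {
  grok {
    # hypothetical field name; the named capture keeps everything after the
    # first dot, so "shop.example.com" yields [domain] = "example.com"
    match => { "hostname" => "^[^.]+\.(?<domain>.+)$" }
  }
}
```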

@Badger I want to extract the domain name from my log files.
Each line has the domain name written in it.

That way I can use those domain names in Kibana as a filter.
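One way to get such a field, assuming the Apache LogFormat was customized to prepend the virtual host (e.g. "%v %h %l %u %t ...") so that each line starts with the domain, is to put a custom capture in front of COMBINEDAPACHELOG:

```
filter {
  grok {
    # assumes the vhost leads each line; [domain] then becomes its own
    # field that Kibana can aggregate and filter on
    match => { "message" => "^%{IPORHOST:domain} %{COMBINEDAPACHELOG}" }
  }
}
```

This is only a sketch — it depends entirely on where the domain actually appears in your log lines, which the sample above doesn't show.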

Can anyone help me with this?
I'm still stuck.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.