Unable to see a new field in Kibana that was defined in Logstash

Hello, I am new to the ELK stack and working on the basics: shipping a log through Filebeat and viewing it in the Kibana dashboard. I want to expose an extra field in Kibana taken from part of the existing message field. To do so, I edited the Logstash filter section to use a grok filter with add_field. However, when I refresh the fields of the index pattern in Kibana, the new field I added in logstash.conf does not appear. Is this the right way to get the new field onto my Kibana dashboard? Can anyone please guide me?

Here are my Logstash conf files:
input.conf
input {
  beats {
    port => 5044
  }
}

filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGBASE}%{SPACE}%{GREEDYDATA}" }
      add_field => { "error_message" => "%{GREEDYDATA}" }
    }
  }
}

output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Sample event log line:
Dec 19 13:59:33 ctrl2 systemd[1]: systemd-journal-upload.service: Consumed 26ms CPU time
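One likely cause, for reference: the %{GREEDYDATA} inside add_field is a sprintf-style field reference, so it looks up an event field literally named GREEDYDATA rather than the text matched by the grok pattern; since no such field exists, error_message never gets a useful value and nothing new shows up in Kibana. A minimal sketch of a filter that names the capture instead, assuming the goal is to store everything after the syslog header in error_message:

filter {
  if [type] == "syslog" {
    grok {
      # naming the capture makes it an event field directly,
      # so no add_field is needed
      match => { "message" => "%{SYSLOGBASE}%{SPACE}%{GREEDYDATA:error_message}" }
    }
  }
}

With the sample line above, error_message should then hold "systemd-journal-upload.service: Consumed 26ms CPU time" (assuming SYSLOGBASE consumes through "systemd[1]:"). After new events are indexed, refreshing the index pattern's field list in Kibana should pick the field up.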
