How to index a custom field that has been added to logstash.conf

I've added a custom field to logstash.conf; it shows up in the Kibana UI Console, but the field "actual_host_is" is not indexed. How do I index a custom field added to logstash.conf?

add_field => {
  "actual_host_is" => "${HOSTNAME}"
}
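For context, Logstash resolves ${HOSTNAME} from its own environment at startup, so the field only gets a value if that variable is set for the Logstash process. A minimal sketch of the complete setting, using Logstash's standard ${VAR:default} substitution syntax; the mutate filter and the "unknown" fallback are illustrative assumptions, not from the original post:

```text
filter {
  mutate {
    # ${HOSTNAME:unknown} falls back to the literal "unknown"
    # if the HOSTNAME environment variable is not set
    add_field => { "actual_host_is" => "${HOSTNAME:unknown}" }
  }
}
```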

Can you elaborate on your config and sample data a bit more?
Are you sure that the new field is not indexed or could it be a kibana index pattern issue?

Thank you Admlko for your response. Yes, certainly, I will describe the details further.


I am running an nginx server in a Docker container, redirecting its logs to Logstash as below:

docker run --log-driver=syslog --log-opt syslog-address=tcp://localhost:5000 --log-opt syslog-facility=daemon --name MyNginx -d -p 80:80 nginx:latest


Below is the logstash.conf for capturing the nginx access log:

input {
  tcp {
    type => syslog
    port => 5000
    add_field => { "[@metadata][input]" => "tcp" }
  }
}

filter {
  grok {
    match => { "message" => "%{IPORHOST:remote_ip} - %{DATA:user_name} \[%{HTTPDATE:access_time}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" %{NUMBER:response_code} %{NUMBER:body_sent_bytes} \"%{DATA:referrer}\" \"%{DATA:agent}\"" }
    add_field => {
      "actual_host_is" => "${ACTUAL_HOST}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
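Whether this is an Elasticsearch mapping problem or a stale Kibana index pattern can be checked directly against Elasticsearch with its get-field-mapping API. A hedged sketch, assuming Elasticsearch is reachable on localhost:9200 and using the index name that appears in the Discover output below; the command only builds and prints the request URL, with the actual curl call left commented for when a live cluster is available:

```shell
# Hypothetical index name taken from the Discover output; adjust the date as needed.
INDEX="logstash-2019.01.02"
# The get-field-mapping endpoint shows whether a field is mapped (and thus indexed).
URL="http://localhost:9200/${INDEX}/_mapping/field/actual_host_is?pretty"
echo "GET ${URL}"
# curl -s "${URL}"   # run this against the live cluster
```

If the response contains a mapping for actual_host_is, the field is indexed on the Elasticsearch side and the problem is in Kibana.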


Below is the Kibana console output at http://localhost:5601/:

@timestamp January 2nd 2019, 11:41:55.674
t @version 1
t _id Y3InDmgBfuiwMFhB3rcy
t _index logstash-2019.01.02
# _score -
t _type doc
t access_time 02/Jan/2019:10:41:54 +0000
? actual_host_is kmaster

Please see the last line: actual_host_is kmaster. I have got the hostname, but the problem in the Kibana console is the error "Unindexed fields cannot be searched"; the other nginx access log fields are searchable. I want the "actual_host_is" field indexed for searching, which is not happening.


Please format your post correctly, I will take a look at it after formatting :slight_smile:

Yeah, correct, it was because of the = signs I used as separators between the different config files; they were rendered as bold. I put new lines before and after the = signs to avoid the big bold words. Apologies for the mishap; please have a look, I hope it's fine now. Thanks, Baidurjya

Please use block quotes and preformatted text also for configs.

Where does the actual_host_is come from? I don't see it in the grok config...
What do you mean by Kibana console output? The Discover page in Kibana, or Console page?

thanks for your reply.

I am running Logstash with the environment variable ACTUAL_HOST=$(hostname), as below.

docker run -h logstash --name logstash --network incowia_default --link elasticsearch:elasticsearch -it --rm -p 5000:5000 -e ACTUAL_HOST=$(hostname) -v "$PWD":/config-dir logstash:6.5.1 -f /config-dir/logstash.conf
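One subtlety worth noting (an aside, not from the original thread): $(hostname) is expanded by the host's shell before docker run executes, so the container receives the host machine's name as a fixed literal value rather than something evaluated inside the container. A small sketch:

```shell
# $(hostname) is substituted by the calling shell, so ACTUAL_HOST already
# holds the host's name before Docker (or Logstash) ever sees the variable.
ACTUAL_HOST=$(hostname)
echo "ACTUAL_HOST=${ACTUAL_HOST}"
# Inside the container, Logstash then reads this value via ${ACTUAL_HOST}
# in logstash.conf; nothing is re-evaluated at container runtime.
```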

Then I am referencing this ACTUAL_HOST variable in the add_field setting of the grok filter shown earlier in logstash.conf.

This hostname (kmaster in this case) is flowing to the Kibana Discover page successfully, but it is not searchable; I get the error message "Unindexed fields cannot be searched".
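If the field really is indexed in Elasticsearch, a direct query that bypasses Kibana will find the document, which would confirm the problem is on the Kibana side. A hedged sketch, assuming Elasticsearch on localhost:9200 and the hostname value shown above; as with the earlier mapping check, the request is printed rather than executed:

```shell
# Hypothetical: a URI search against Elasticsearch for the custom field.
Q="http://localhost:9200/logstash-*/_search?q=actual_host_is:kmaster&pretty"
echo "GET ${Q}"
# curl -s "${Q}"   # a hit here proves the field is searchable in Elasticsearch itself
```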

Now I should explain the motivation for doing this: there are a few VMs, and each VM runs one nginx Docker instance and one Logstash Docker instance, while Elasticsearch + Kibana run on a central server. Logstash on each VM captures that VM's nginx log and forwards it to the central Kibana server, so to tell which particular VM an nginx access log was generated on, I have added this custom hostname field that identifies the source VM.

Thanks for clarifying, makes sense.

I didn't realize that actual_host_is comes from an environment variable; I was under the impression that it comes from the log.

This sounds like an index pattern issue. Please try to refresh your index pattern from Kibana's Management app.


Thanks very much Admlko, refreshing the index pattern made the custom field "actual_host_is" searchable. You saved my life :slight_smile: Many, many thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.