Is there any way I can tag the Elasticsearch server name in a field?

My data flows from the source server --> Redis (on Elasticsearch server A) --> Elasticsearch (running on the same server as Redis).

As far as I know, Logstash can only parse whatever is available in the message or path, and the Elasticsearch server name is not available in the document. In this case, is there any way I can create a field holding the Elasticsearch server name?

Below is an example document:

"message" => "09/22/16: 11:09:31.904: \n RECEIVING INTERNAL message, acitype=0x44440003 antype=32000 size=105\n FROM: LSSDADM100SFCSrv \n TO: LSSDADM100SFCSrv \nfr=LSSDADM100SFCSrv to=LSSDADM100SFCSrv do=sfc chart=FwHostSFC cctxt=LSSDADM100 step=S8 noreply nosyntax\n==================================================",
"@version" => "1",
"@timestamp" => "2016-09-22T03:09:32.709Z",
"host" => "fslautodev001",
"path" => "/var/mtapps/ashl/logs/AMAT/ProducerGT/LSSDADM100/mq-sLSSDADM100.log.th",
"type" => "Auto::Host::mqs",
"tags" => [
[0] "multiline"
]

So where is Logstash running? In your data flow graph I only see the source server (is that where Logstash runs?), Redis, and Elasticsearch.

I have Logstash running on both the source server and the ELK server. The one on the source server ships the log output to Redis; another Logstash instance on the ELK server does the actual parsing and outputs to Elasticsearch.
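
Roughly, the shipper side looks something like this (a simplified sketch; the exact paths and options here are assumptions, not my real config):

input {
  file {
    # watched log files on the source server (illustrative path)
    path => "/var/mtapps/ashl/logs/**/*.log.th"
  }
  # the real shipper also joins multiline events, per the "multiline" tag above
}
output {
  redis {
    # points at the ELK server, where Redis runs
    host => "fslelk02"
    data_type => "list"
    key => "HostLog"
  }
}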

Perhaps you can read the HOSTNAME environment variable? See https://www.elastic.co/guide/en/logstash/current/environment-variables.html. Otherwise you can obtain the hostname via a ruby filter (try googling "logstash ruby filter gethostname").
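
An untested sketch of the ruby filter approach (the field name elkserver is just an example):

filter {
  ruby {
    # Socket.gethostname returns this machine's hostname; on Logstash 2.x
    # events are accessed hash-style (Logstash 5.x+ uses event.set instead)
    code => "require 'socket'; event['elkserver'] = Socket.gethostname"
  }
}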

Hi Mag, yeah, this is exactly what I want, but I just gave it a quick try and it seems it cannot get the data.
BTW, I use Logstash 2.3.2.

Logstash config:

input {
  redis {
    host => "localhost"
    #type => "redis-input"
    data_type => "list"
    key => "HostLog"
  }
}

filter {
  # note: the named-capture labels below are placeholders; the original
  # names were swallowed by the forum formatting
  grok {
    match => { "path" => "/var/mtapps/ashl/logs/(?<vendor>[A-Za-z0-9]+)/(?<process>[A-Za-z0-9_]+)/(?<machine>[A-Z0-9]+)/%{GREEDYDATA}" }
  }
  grok {
    match => { "message" => "^(.)?(?<logtime>%{DATE_US}: %{TIME}):%{GREEDYDATA}" }
  }
  mutate {
    add_field => { "elkserver" => "${HOSTNAME}" }
  }

  if ([type] == "Auto::Host::mqs") {
    kv {
      field_split => " \r\n"
      include_keys => ["fr", "to", "SF", "do", "command", "reply", "SchedState", "SubState", "MESCode", "MESErr", "BatchId", "CEID", "SubEquipId", "ProcessName", "EventName"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["fslelk02"]
    index => "host-%{+YYYY.MM.dd}"
    workers => 4
  }
}

And I checked that the environment variable does have a value:
HOSTNAME=fslelk02

And did you start Logstash with --allow-env?

I tried running it on the command line with /opt/logstash/bin/logstash -f indexer_auto_host.conf --allow-env
and it gets the correct value now.

However, we normally run Logstash as a daemon. Is there any way to configure this in the .conf file or the startup file?

There's an environment variable you can set in /etc/default/logstash (Debian) or /etc/sysconfig/logstash (RPM) that contains extra options to pass to Logstash.
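
For example (assuming the standard package layout; on Logstash 2.x the relevant variable is LS_OPTS):

# in /etc/default/logstash (Debian) or /etc/sysconfig/logstash (RPM)
LS_OPTS="--allow-env"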