Sending Extracted Fields to Kibana

Hi,

I have modified my Logstash configuration file to extract some fields. The extracted fields are displaying in the logstash.stdout file but are not showing in Kibana.

My configuration file looks like this:

filter {
  if [type] == "cloudera" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      # these fields are only added when the grok match succeeds
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
How do I send these extracted fields from Logstash to Kibana?

Which fields in particular?

Thanks for the reply.
I want to display the fields received_at and received_from in Kibana.

Note that @timestamp is set when Logstash first creates the event, not when the document reaches Elasticsearch, so that add_field line captures the time Logstash received the line; the date filter then overwrites @timestamp itself with the timestamp parsed from the log.
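For illustration, a rubydebug event from this filter would look roughly like this (hypothetical hostname, pid, and timestamps, not taken from the original logs; syslog_pri adds a few more fields such as syslog_severity):

{
              "message" => "Jun 11 08:03:45 node01 sshd[1234]: Accepted publickey for root",
             "@version" => "1",
           "@timestamp" => "2015-06-11T08:03:45.000Z",
                 "type" => "cloudera",
                 "host" => "node01",
     "syslog_timestamp" => "Jun 11 08:03:45",
      "syslog_hostname" => "node01",
       "syslog_program" => "sshd",
           "syslog_pid" => "1234",
       "syslog_message" => "Accepted publickey for root",
          "received_at" => "2015-06-11T08:03:47.214Z",
        "received_from" => "node01"
}

Here received_at holds the time Logstash received the event, while @timestamp has been rewritten by the date filter to the time parsed out of the log line.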

Are you seeing the received_from field in stdout?

Yes, I'm seeing received_from in stdout.

Here is my output:

Then it'll be uploaded to Elasticsearch and visible via Kibana.

Can you not see this in the documents under the Discover tab?

No, these fields are not displaying under the Discover tab.

Do I need to make any other changes to send these fields to Kibana?

Try refreshing your field list under the index pattern settings (in Kibana: Settings → Indices, select your index pattern, then click the refresh fields button)?
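If a refresh doesn't help, you can also confirm the fields actually reached Elasticsearch by querying the field mapping directly; this sketch assumes the default logstash-* index pattern:

curl 'localhost:9200/logstash-*/_mapping/field/received_from?pretty'

An empty mapping in the response means the field never made it into Elasticsearch, so the problem is on the Logstash side; if the mapping is there, Kibana just needs its field list refreshed.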

Thanks, Mark Walkom.

Now I'm getting the extracted fields in Kibana.