Sending Extracted Fields to Kibana


(sushmitha) #1

Hi,

I have modified the Logstash configuration file to extract some fields. The extracted fields are displayed in the logstash.stdout file, but they are not showing up in Kibana.

My configuration file looks like this:

filter {
  if [type] == "cloudera" {
    # parse the raw syslog line into separate fields
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => {
        "received_at" => "%{@timestamp}"
        "received_from" => "%{host}"
      }
    }
    syslog_pri { }
    # set @timestamp from the parsed syslog timestamp
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
How do I send these extracted fields from Logstash to Kibana?


(Mark Walkom) #2

Which fields in particular?


(sushmitha) #3

Thanks for the reply.
I want to display the fields received_at and received_from in kibana.


(Mark Walkom) #4

@timestamp is only generated when the doc is sent to Elasticsearch, so that add_field line won't work.

Are you seeing the received_from field in stdout?
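
One quick way to check (a sketch, not from this thread: it assumes a local install run as bin/logstash and a made-up sample syslog line) is to pipe a single line through the same grok filter with only a stdout output, so Elasticsearch isn't involved at all:

# Hypothetical test: run the grok filter against one sample line and
# print the resulting event with rubydebug; if received_from shows up
# here, the filter side is working.
echo 'Oct  5 12:34:56 myhost myprog[123]: hello world' | bin/logstash -e '
  input { stdin { } }
  filter {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => { "received_from" => "%{host}" }
    }
  }
  output { stdout { codec => rubydebug } }'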


(sushmitha) #5

Yes, I'm seeing received_from in stdout.

Here is my output:


(Mark Walkom) #6

Then it'll be uploaded to Elasticsearch and visible via Kibana.

Can you not see this in the documents under the Discover tab?
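
You can also confirm the documents are reaching Elasticsearch without Kibana in the way (a sketch, assuming Elasticsearch on localhost:9200 and the default logstash-* index naming) by searching for the field directly:

# Hypothetical check: return one document that actually contains the
# received_from field, straight from Elasticsearch.
curl 'localhost:9200/logstash-*/_search?q=_exists_:received_from&size=1&pretty'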


(sushmitha) #7

No, these fields are not displayed under the Discover tab.

Do I need to make any other changes to send these fields to Kibana?


(Mark Walkom) #8

Try refreshing your field list under the index pattern settings?
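
Kibana's Discover field list comes from the index pattern's cached field mapping, so new fields only appear after that cache is refreshed. If they still don't show up, it's worth checking what Elasticsearch has actually mapped (again a sketch assuming the default logstash-* indices):

# Hypothetical check: list the mapped fields for the logstash-* indices;
# received_at and received_from should appear here once documents
# containing them have been indexed.
curl 'localhost:9200/logstash-*/_mapping?pretty'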


(sushmitha) #9

Thanks, Mark Walkom.

Now I'm getting the extracted fields in Kibana.

