Kv plugin behavior


(Alan Frabutt) #1

Greetings all,
I believe I've made a solid effort digging around other posts (and sites) for more info, but I'm not getting traction. The behavior I'd expect from the code below is that it would break out each kv pair in the message field into discrete searchable fields. Not seeing that, though.

filter {
  if [type] == "syslog" {
    mutate {
      # A single gsub array with multiple field/pattern/replacement triplets
      # avoids declaring the same option twice in one plugin block.
      gsub => [
        "message", "\"", "",
        "message", ": ", "="
      ]
    }
    kv {
      source => "message"
      include_keys => ["sysloghost", "severity", "facility", "programname", "procid"]
    }
  }
}

Any obvious problems with that?

Many thanks in advance!


(Magnus Bäck) #2

What does the input look like? What output do you get?


(Alan Frabutt) #3

Thanks for the prompt response! 1st I'd like to verify I'm not operating under false assumptions. If I were to configure logstash with the syslog input plugin, and direct a vanilla Linux rsyslogd log stream to it, then output to elasticsearch, should I expect kibana to show the individual kv pairs inside the message field to exist as searchable fields? I'm sanitizing example data now for upload.
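
For reference, a minimal version of the pipeline being described might look like the sketch below (the listening port and Elasticsearch endpoint are assumptions, not values from the thread). One caveat worth noting: a vanilla rsyslogd message is free-form text, not key=value data, so the kv filter will only produce fields if the message body actually contains delimited key/value pairs.

```
input {
  syslog {
    port => 5514        # assumed port for the rsyslog stream
    type => "syslog"
  }
}
filter {
  if [type] == "syslog" {
    kv {
      source => "message"   # only yields fields if the message holds key=value pairs
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumed Elasticsearch endpoint
  }
}
```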


(Ry Biesemeyer) #4

The KV filter plugin allows you to specify your field_split and value_split character classes (or even full splitter patterns with field_split_pattern and value_split_pattern in later releases); it is also pretty good at capturing quoted values. You might not need to do gsub operations to clean up the data before invoking the kv filter.
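
As a sketch of that idea, the gsub steps above could likely be replaced by configuring the split characters directly on the kv filter (the separator choices here are assumptions about the input format, and the trim options are available in recent plugin versions):

```
filter {
  kv {
    source => "message"
    field_split => " "    # assumed: pairs separated by whitespace
    value_split => ":"    # split key from value on the colon instead of rewriting to "="
    trim_key => " "       # strip stray spaces around keys
    trim_value => "\" "   # strip surrounding quotes/spaces instead of gsub-ing them away
  }
}
```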

Docs for current version are here.


(Alan Frabutt) #5

Let's just close this one out... the problem doesn't appear to have anything to do with kv. I've tried following the example at
https://www.elastic.co/blog/how-to-centralize-logs-with-rsyslog-logstash-and-elasticsearch-on-ubuntu-14-04, and I'm getting a _jsonparsefailure tag, so I'll try to follow that breadcrumb trail.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.