AH!--Websites being set up as fields--KV plugin

Oh wow, my number of fields blew up last night after I added a new server pushing logs to our Logstash server. For some reason it's taking all of the websites and setting them as fields. I think it's related to the KV plugin, so I added the following, but the websites are still being created as fields.

Any idea what I'm doing wrong to block those websites from being created as fields? Here's what it looks like:

kv {
  exclude_keys => [ "http*" ]
  remove_field => [ "http*" ]
  source => "foo"
}

I added the http lines this morning after the field count skyrocketed overnight. Any help would be appreciated, thank you!!
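One thing worth checking: as far as I know, the kv filter's exclude_keys option matches literal key names, not wildcard patterns, and remove_field likewise takes exact field names, so "http*" may simply never match anything. If that's the case, one workaround is the prune filter, which does support regular-expression matching on field names. A sketch (the "^http" pattern is my assumption about what the unwanted keys look like):

```
filter {
  kv {
    source => "foo"
  }
  # prune matches field names against regular expressions;
  # blacklist_names drops any field whose name matches.
  prune {
    blacklist_names => [ "^http" ]
  }
}
```

This drops the URL-shaped fields after the kv filter creates them, rather than trying to stop kv from parsing them in the first place.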

What does the rest of the configuration look like? What does an input line look like? And an event produced by Logstash?

OK, don't laugh. My firewall Logstash configuration is quite a bit more complex, but for this Linux server I'm a firm believer that less is more, so:

filter {

  metrics {
    meter => "documents"
    add_tag => "metric"
    flush_interval => 60
  }

  grok {
    tag_on_failure => ["_failed1"]
    match => ["message", "%{GREEDYDATA:foo}"]
  }

  kv {
    exclude_keys => [ "http*" ]
    remove_field => [ "http*" ]
    source => "foo"
  }
}

That's it!
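Since this config runs kv over the entire message (grok just copies message into foo), every key=value pair in the log becomes a field. If you only actually need a handful of keys from these logs, an allowlist may be safer than trying to blocklist everything unwanted. A sketch using include_keys, where "src_ip" and "action" are placeholder key names, not taken from the original post:

```
kv {
  source => "foo"
  # include_keys keeps only the listed keys;
  # every other key=value pair is ignored.
  # These key names are hypothetical examples.
  include_keys => [ "src_ip", "action" ]
}
```

With an allowlist, a new server sending unexpected keys can't blow up the field count the way this one did.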

As far as an event produced by Logstash, here's where something weirder happens: I get an error of "Courier Fetch: 2 of 24 shards failed.", which is odd because other searches work fine, just not when using those website fields.

So, in Kibana I'm typing something like this (without the quotes): "http://website.com:*", just to see if I can get you an event, and got that error. I also looked in Kibana and confirmed that http://website.com.keyword is a searchable field.
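Part of the problem with that query may be the query syntax itself. In Lucene query string syntax, characters like : and / are reserved, so a field whose name is a URL can't be referenced without escaping each one. A sketch of what an existence query for such a field might look like (website.com is the placeholder name from the post):

```
http\:\/\/website.com.keyword:*
```

That said, fields named after URLs are awkward to work with in general, which is another argument for keeping kv from creating them in the first place.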

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.