Oh wow, my number of fields blew up last night after I added a new server pushing logs to our Logstash server. For some reason it's taking all of the websites and creating fields out of them. I think it's related to the kv filter plugin, so I added the following, but the websites are still being created as fields.
Any idea what I'm doing wrong to block those websites from being created as fields? Here's what it looks like:
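The bit I added is shaped roughly like this (just a sketch of the shape; the source field and the key names below are placeholders, not my real ones):

```
filter {
  kv {
    source => "message"   # placeholder: the field the key=value pairs get parsed from
    # my attempt at keeping the URL-looking keys from becoming fields;
    # the entries below are placeholders
    exclude_keys => [ "http://website.com", "http://website.com:80" ]
  }
}
```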
OK, don't laugh: my firewall Logstash configuration is quite a bit more complex, but for this Linux server I'm a firm believer that less is more, so:
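Simplified down, the whole pipeline is shaped about like this (the port, host, and source field are placeholders, not the literal file):

```
input {
  beats {
    port => 5044                      # placeholder port
  }
}

filter {
  kv {
    source => "message"               # placeholder source field
    # same exclusion attempt as above; key names are placeholders
    exclude_keys => [ "http://website.com", "http://website.com:80" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]       # placeholder host
  }
}
```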
As for getting you an event produced by Logstash, here's where something weirder happens. I get an error of "Courier Fetch: 2 of 24 shards failed.", which is odd because other searches work fine; it only fails when I search on those website fields.
So, in Kibana I typed something like this (without the quotes): "http://website.com:*", just to see if I could get you an event, and got that error. Also, I did confirm in Kibana that http://website.com.keyword is a searchable field.