The Filebeat client's agent field overwrites the agent field produced by the COMBINEDAPACHELOG grok pattern, which results in errors in Logstash:
[ERROR][logstash.filters.useragent] Uknown error while parsing user agent data {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyHash to class java.lang.String>, :field=>"agent", :event=>#LogStash::Event:0x6b358a65}
One solution to the problem would be to change the COMBINEDAPACHELOG grok pattern to COMMONAPACHELOG %{QS:referrer} %{QS:someotherfield} and modify the useragent configuration accordingly.
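Roughly, the Logstash filter section for that workaround would look like this (just a sketch - the apache_agent capture name and the useragent target are examples, not the exact config I run):

filter {
  grok {
    # expand COMBINEDAPACHELOG by hand so the user agent string lands in
    # its own field instead of "agent", which filebeat already uses
    match => { "message" => "%{COMMONAPACHELOG} %{QS:referrer} %{QS:apache_agent}" }
  }
  useragent {
    # point the useragent filter at the renamed capture
    source => "apache_agent"
    target => "user_agent"
  }
}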
Is there any configuration option on the filebeat client to change the agent field to something else?
But the basic problem still persists: the filebeat agent creates a field named agent, and the COMBINEDAPACHELOG grok pattern uses the same field name.
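Another option on the Logstash side would be to move the filebeat field out of the way before grok runs, so COMBINEDAPACHELOG can stay unchanged (again only a sketch - the beat_agent name is arbitrary):

filter {
  mutate {
    # move the filebeat-provided agent object aside so the "agent" capture
    # from COMBINEDAPACHELOG no longer collides with it
    rename => { "agent" => "beat_agent" }
  }
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  useragent {
    source => "agent"
    target => "user_agent"
  }
}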
I understand the conflict, and I'm glad there is a workaround for the problem. But I agree we could do a bit of work to make sure that fields extracted from grok patterns don't conflict with fields defined by filebeat.
What if I create an issue for filebeat instead? The COMBINEDAPACHELOG grok pattern has been in use for a long time, while the filebeat agent field was added only recently. Or maybe a feature request for filebeat: add a configuration option for the filebeat field name. According to https://github.com/elastic/beats/blob/master/CHANGELOG.asciidoc, the field names were changed from beat.name to agent.type, beat.hostname to agent.hostname, and beat.version to agent.version in version 7.0.0-alpha1.