I have an old version of Logstash, exactly 1.5.6.
I built a grok filter and tested it with http://grokdebug.herokuapp.com/
In Grok Debugger it parses fine and I get my fields. Here are the grok filter and an example log line:
%{TIMESTAMP_ISO8601:received_at} %{GREEDYDATA:} %{USERNAME:username} %{IP:calling_station_id}.*LOG: %{GREEDYDATA:audit_type}:.*
2019-03-15 00:00:02 CET [unknown] postgres 192.168.9.11 976LOG: AUDIT: SESSION,4506,1,FUNCTION,EXECUTE,FUNCTION,gimp.prparameters_get_value,"SELECT * FROM gimp.prparameters_get_value('EMAIL_SENDER_ADDRESS',NULL)",<not logged>
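As a sanity check outside Grok Debugger, the pattern above can be approximated in plain Python regex (the sub-patterns below are simplified stand-ins for the stock grok definitions, not the exact ones):

```python
import re

# Simplified stand-ins for the stock grok patterns (assumption:
# close enough for this sample line, not the exact grok regexes)
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}(?::\d{2})?"
USERNAME = r"[a-zA-Z0-9._-]+"
IP = r"(?:\d{1,3}\.){3}\d{1,3}"

# Mirrors: %{TIMESTAMP_ISO8601:received_at} %{GREEDYDATA:} %{USERNAME:username}
#          %{IP:calling_station_id}.*LOG: %{GREEDYDATA:audit_type}:.*
pattern = re.compile(
    rf"(?P<received_at>{TIMESTAMP_ISO8601}) .* "
    rf"(?P<username>{USERNAME}) (?P<calling_station_id>{IP})"
    r".*LOG: (?P<audit_type>.*):.*"
)

line = ("2019-03-15 00:00:02 CET [unknown] postgres 192.168.9.11 976LOG: "
        "AUDIT: SESSION,4506,1,FUNCTION,EXECUTE,FUNCTION,"
        "gimp.prparameters_get_value,\"SELECT * FROM "
        "gimp.prparameters_get_value('EMAIL_SENDER_ADDRESS',NULL)\","
        "<not logged>")

m = pattern.match(line)
print(m.group("received_at"))         # 2019-03-15 00:00:02
print(m.group("username"))            # postgres
print(m.group("calling_station_id"))  # 192.168.9.11
print(m.group("audit_type"))          # AUDIT
```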
Now, when I try to load the data into Elasticsearch via Logstash, I always get _grokparsefailure.
I tried reducing the grok filter, and even with this simple configuration I get the same error.
The configuration is:
filter {
  if [message] =~ /^\s*Safed\[/ {
    mutate { replace => [ "source", "Safed" ] }
  }
  if [source] == "Safed" {
    # Safed without stripped syslog-tag "Safed[...][...]"
    # SAMPLE AUDIT
    # 03-15-2019 00:00:02 CET [unknown] postgres 192.168.9.11 976LOG: AUDIT: SESSION,4506,1,FUNCTION,EXECUTE,FUNCTION,gimp.prparameters_get_value,"SELECT * FROM gimp.prparameters_get_value('EMAIL_SENDER_ADDRESS',NULL)",<not logged>
    grok {
      patterns_dir => "/var/lib/neteye/logstash/etc/pattern.d"
      match => [ "message", "%{DATESTAMP:prova_date}.*" ]
      overwrite => [ "message" ]
      remove_tag => "_grokparsefailure"
      break_on_match => false
    }
  }
}
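While narrowing this down, it can help to take Elasticsearch out of the loop and run the filter against stdin. A minimal test pipeline (a sketch; the grok body is copied from the configuration above) would be:

# test.conf -- feed lines on stdin and print the parsed event
input { stdin { } }

filter {
  grok {
    patterns_dir => "/var/lib/neteye/logstash/etc/pattern.d"
    match => [ "message", "%{DATESTAMP:prova_date}.*" ]
  }
}

output { stdout { codec => rubydebug } }

Running bin/logstash -f test.conf and pasting the audit line shows immediately whether the event carries prova_date or the _grokparsefailure tag.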
What is wrong with my little configuration?
Thank you
Franco