These are the fields:
date time time-taken bytes cs-method cs-uri sc-status sc(X-ORACLE-DMS-ECID) cs(ECID-Context) cs(Proxy-Remote-User) cs(Proxy-Client-IP)
It's been a long while since I worked on WebLogic, but I believe you can alter the log format at the Log4j level, which is easier than wrestling with grok. At the very least, it would let you use custom separators so you could more easily tokenize with either the grok or kv filters.
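For example, if the access-log format were changed to emit delimiter-separated key=value pairs (a hypothetical format, not WebLogic's default), the kv filter could tokenize the line without any grok at all:

filter {
  kv {
    source      => "message"  # the raw access-log line
    field_split => "|"        # hypothetical custom separator configured in the log format
    value_split => "="        # separator between each key and its value
  }
}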
Thanks Aaron. Yes, I can change it at the Log4j level, but this will be used across several domains, so I would need to update a lot of WebLogic servers. At this point I am trying to parse the original log file instead, which is probably why I am not able to parse these access logs.
There don't appear to be any spaces in your grok expression, which is one obvious reason why things aren't matching.
To debug grok expressions I recommend starting off simple with e.g.
%{ISO8601_DATE:date}.*
and making sure that works. Pipe test data to Logstash via a stdin input and use a stdout { codec => rubydebug } output (or use https://github.com/magnusbaeck/logstash-filter-verifier). Then add one token at a time, i.e. your next step would be to append a pattern for the next field (the time).
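A minimal config for this kind of debugging might look like this (a sketch; the grok expression is just the starting point from above and should be extended one token at a time):

input {
  stdin {}
}

filter {
  grok {
    # The expression currently under test; grow it field by field.
    match => { "message" => "%{ISO8601_DATE:date}.*" }
  }
}

output {
  stdout { codec => rubydebug }
}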
root@vagrant-ubuntu-trusty-64:/opt/logstash# bin/logstash -f /etc/logstash/conf.d/1.conf
Default settings used: Filter workers: 1
The error reported is:
pattern %{ISO8601_DATE:date} not defined
There is no grok pattern named ISO8601_DATE. Where did you get the idea to use it? Perhaps you need to configure Logstash to use additional pattern files. But TIMESTAMP_ISO8601 exists and should match your timestamp.
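For what it's worth, given the field list at the top of the thread, something like this might work as a starting point (an untested sketch; it assumes the fields are space separated and that the ECID, user, and IP fields never contain spaces):

filter {
  grok {
    match => {
      # TIMESTAMP_ISO8601 matches "date time" in one go when they are space separated.
      "message" => "%{TIMESTAMP_ISO8601:timestamp} %{NUMBER:time_taken} %{NUMBER:bytes} %{WORD:cs_method} %{NOTSPACE:cs_uri} %{NUMBER:sc_status} %{NOTSPACE:ecid} %{NOTSPACE:ecid_context} %{NOTSPACE:proxy_remote_user} %{NOTSPACE:proxy_client_ip}"
    }
  }
}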