I am new here. I am trying to parse my Tomcat application logs using the grok filter, but it doesn't work.
Basically, I'm reading a log file with Filebeat and sending it to Logstash, which I use for filtering.
I applied some grok patterns, but they don't work. My log file contains:
16/02/2017 19:21:17:549 INFO - 6461C963C390DCAEF6F8F58A3AB864A9:/RCOM_PREPAID : Storing :input_customAudioLocation to simple: input_customAudioLocation as [DEFAULT]
16/02/2017 19:21:17:549 INFO - C544704513DEDEF396611BAD12A651B3:/RCOM_PREPAID : Storing :input_confirmationDeniedApology to simple: input_confirmationDeniedApology as [DEFAULT]
16/02/2017 19:21:17:549 DEBUG - 077F64575CD008E4FF413AE352989E2A:/RCOM_PREPAID : no variable for simple: __VPvpms
16/02/2017 19:21:17:549 ERROR - 46C6765BE2F8CC30F35575FF4611C280:/RCOM_PREPAID : session id:cgrmpp07-2017047130303-397 | Error processing request
EXCEPTION>
java.lang.ClassCastException: com.avaya.sce.runtime.ReturnError cannot be cast to com.avaya.sce.runtime.IPostGenerator
at com.avaya.sce.runtime.AppDocument.processRequest(AppDocument.java:251)
at com.avaya.sce.runtime.SCEServlet.requestHandler(SCEServlet.java:285)
at com.avaya.sce.runtime.SCEServlet.doGet(SCEServlet.java:182)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
16/02/2017 19:23:28:745 INFO - 28E116CB857AA3439D9261B2B617E545:/RCOM_PREPAID : Using SCESession 28E116CB857AA3439D9261B2B617E545:/RCOM_PREPAID servlet : LinkDown
16/02/2017 19:23:28:730 DEBUG - 33AFED9A8EDFA7BF36E1A116BE52929C:/RCOM_PREPAID : Report XML:<?xml version="1.0" encoding="utf-16" standalone="yes"?>
<IVRREPORTDATA>
<CALLINFO>
<UNIQUECALLID>10019531311487253188</UNIQUECALLID>
<SESSIONID>33AFED9A8EDFA7BF36E1A116BE52929C:/RCOM_PREPAID</SESSIONID>
<VXMLIP>CGRXMA07/10.132.41.87</VXMLIP>
<APP_NAME>RCOM_CF_198_GSM_PREPAID</APP_NAME>
<HUB>MO</HUB>
<DNIS>9024019000</DNIS>
<DNISTYPE>NA</DNISTYPE>
<STARTDATETIME>16/02/2017 19:23:23</STARTDATETIME>
My Filebeat configuration is:
filebeat.prospectors:
- input_type: log
  paths:
    - C:\ES-Apache\logs\logs_16 feb -2017_prepaid\*.log*
  document_type: prepaid_logs
output.logstash:
  hosts: ["127.0.0.1:5043"]
  # Number of workers per Logstash host. default 1
  worker: 4
This is my Logstash configuration:
input {
  beats {
    host => "127.0.0.1"
    port => "5043"
  }
}

filter {
  if [type] == "prepaid_logs" {
    if "_grokparsefailure" in [tags] {
      drop { }
    }
    grok {
      match => { "message" => "(?:m) %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:logLevel} - %{POSTFIX_SESSIONID:sessionId}:/%{GREEDYDATA:applicationName} : %{GREEDYDATA:messageText}" }
    }
    date {
      match => ["timestamp", "dd/mm/yyyy:HH:mm:ss:SSS"]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      manage_template => false
    }
  }
}
I applied a grok pattern to the message field and expected it to create the JSON fields I want, since I am trying to parse the message.
Instead, it shows a _grokparsefailure error.
Hi @magnusbaeck, following your point about the timestamp, I changed it in my pattern and added a multiline check:
if a line begins with a timestamp, it is treated as a new event; if it does not begin with a timestamp, it is treated as part of the previous line.
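Roughly, what I changed looks like this (a sketch of the idea rather than my exact final config; the regex may still need tuning). In Filebeat, a multiline rule in the prospector appends non-timestamp lines to the previous event:

```yaml
# Lines that do NOT start with "dd/MM/yyyy HH:mm:ss:SSS" (stack traces,
# report XML, etc.) are appended to the previous event
multiline.pattern: '^\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}:\d{3} '
multiline.negate: true
multiline.match: after
```

And in Logstash, a custom named capture for the timestamp (since %{TIMESTAMP_ISO8601} does not match the dd/MM/yyyy format of my logs), with a date pattern that matches it:

```
grok {
  match => { "message" => "^(?<timestamp>\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}:\d{3}) %{LOGLEVEL:logLevel} - %{DATA:sessionId}:/%{DATA:applicationName} : %{GREEDYDATA:messageText}" }
}
date {
  match => ["timestamp", "dd/MM/yyyy HH:mm:ss:SSS"]
}
```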