Filebeat multiline pattern for date format 'not working'!


(brouk) #1

Hi guys, I have a problem with my multiline pattern in Filebeat. The date format in the log file p_icn looks like [6/13/18 8:11:25:022 CEST], and I used the pattern '^\[[0-9]{1}/[0-9]{2}/[0-9]{2}', but it doesn't match anything.
The other log file, p_test, has a date format like 2018-04-17T15:19:20.313; there I used the multiline pattern '^[0-9]{4}-[0-9]{2}-[0-9]{2}' and it works fine.
Can someone help me with the p_icn log file, please?
My Filebeat config:

filebeat.prospectors:
- type: log
  paths:
    - /home/AA/Dev/logs/p_test.log
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after

- type: log
  paths:
    - /home/AA/Dev/logs/p_icn.log
  multiline.pattern: '^\[[0-9]{1}/[0-9]{2}/[0-9]{2}'
  multiline.negate: true
  multiline.match: after

output.logstash:
    hosts: ["localhost:5044"]

(Jaime Soriano) #2

Hi @bab,

For the example you mention the pattern should work, but it won't work for all cases: notice that the part matching the month only allows a single digit ([0-9]{1}). You probably need to allow one or two digits instead, e.g. '^\[[0-9]{1,2}/[0-9]{2}/[0-9]{2}'.

If this is not the problem, could you share also an example of one of these multiline logs?
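A quick way to sanity-check such patterns outside Filebeat is with any regex engine. Here is an illustrative Python check (the first sample line is from this thread; the second is a hypothetical line with a two-digit month):

```python
import re

# sample line from the thread, plus a hypothetical line with a two-digit month
jun = "[6/13/18 8:11:25:022 CEST] 00000032 SystemOut     O CIWEB Perf : ..."
dec = "[12/3/18 8:11:25:022 CEST] 00000032 SystemOut     O CIWEB Perf : ..."

original = re.compile(r'^\[[0-9]{1}/[0-9]{2}/[0-9]{2}')      # pattern from post #1
flexible = re.compile(r'^\[[0-9]{1,2}/[0-9]{1,2}/[0-9]{2}')  # one- or two-digit month/day

print(bool(original.match(jun)), bool(original.match(dec)))  # True False
print(bool(flexible.match(jun)), bool(flexible.match(dec)))  # True True
```

The original pattern matches the June sample but silently fails once the month (or, with a fixed `{2}`, the day) has a different number of digits; `{1,2}` covers both.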


(brouk) #3

Thank you @jsoriano for your prompt reply
I tried adjusting the length, but it still parses nothing into Elasticsearch, and on stdout I get this warning:

[2018-07-04T17:30:11,540][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"index_3", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x5c260b74>], :response=>{"index"=>{"_index"=>"index_3", "_type"=>"doc", "_id"=>"eyPqZWQBAzFCu_yrTb-L", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"6/13/18 8:11:25:022\" is malformed at \"/13/18 8:11:25:022\""}}}}}

example of the multiline logs:

[6/13/18 8:11:25:022 CEST] 00000032 SystemOut     O CIWEB Perf : [ ] com.ibm.ecm.configuration.DatabaseConfiguration._getProperty() Retrieving the configuration object for cfgKey = interfaceTextLabel.navigator.sys_CurrentState
[6/13/18 6:07:53:875 CEST] 0000005f SystemOut     O CIWEB Error: [myz5cyq(unknown) @ 10.174.12.221] com.ibm.ecm.struts.actions.p8.P8RetrieveItemsAction.executeAction()
com.filenet.api.exception.EngineRuntimeException: FNRCE0051E: E_OBJECT_NOT_FOUND: Das angeforderte Element wurde nicht gefunden. Objektidentität: classId=VersionSeries&objectId={903C4862-0000-C2$
        at com.filenet.engine.retrieve.IndependentClassRetriever.getObject(IndependentClassRetriever.java:650)
        at com.filenet.engine.retrieve.IndependentClassRetriever.getObject(IndependentClassRetriever.java:362)

(Jaime Soriano) #4

Now that I can see the error, I'd say that the multiline is working fine, but something is adding a field that cannot be parsed. What configuration do you have in Logstash? If you are parsing dates there, you will need a different pattern for each of the two formats.
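To see why a single date pattern can't cover both logs, here is an illustrative check in Python (not part of the Logstash pipeline; note that the p_icn stamp separates the milliseconds with a colon, not a dot):

```python
from datetime import datetime

iso_stamp = "2018-04-17T15:19:20.313"   # p_test format
us_stamp  = "6/13/18 8:11:25:022"       # p_icn format (colon before millis)

# each format needs its own parse pattern
d1 = datetime.strptime(iso_stamp, "%Y-%m-%dT%H:%M:%S.%f")
d2 = datetime.strptime(us_stamp, "%m/%d/%y %H:%M:%S:%f")

print(d1)  # 2018-04-17 15:19:20.313000
print(d2)  # 2018-06-13 08:11:25.022000
```

The Logstash date filter works the same way: it tries each pattern in `match` in order until one succeeds.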


(brouk) #5

my Logstash config looks like this:

input {
  beats {
    port => 5044
  }
}
filter {

# Filenet P8 server error log pattern
 if [fields][log_type] == "p8_server_error" {
  grok {
   match => [ "message",
             "%{TIMESTAMP_ISO8601:date} %{DATA:thread} %{DATA:sub} [ ]* %{DATA:category} \- %{LOGLEVEL:sev} %{GREEDYDATA:message}" ]
  overwrite => [ "message" ]
  }
  mutate {
    replace => [ "type", "p8_server_error_log" ]
    }
 }
# CPN JVM log pattern
 if [fields][log_type] == "SystemOut-CPE-JVM" {
  grok {
   match => [ "message",
             "%{DATESTAMP:date} %{DATA} %{DATA:thread} %{DATA:java-class} %{DATA:sev} %{DATA:java-package} %{DATA:java-method} %{GREEDYDATA:message}" ]
   overwrite => [ "message" ]
  }
  mutate {
    replace => [ "type", "SystemOut-CPE-JVM_log" ]
    }
 }

date {
   match => [ "date", "yyyy-MM-dd'T'HH:mm:ss.SSS", "M/dd/YY HH:mm:ss:SSS", "M/d/YY HH:mm:ss:SSS", "MM/d/YY HH:mm:ss:SSS", "ISO8601" ]
   }
}


output {
  #stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "index_1"
  }
}

@jsoriano can you see where the problem with the date filter is? I can't find it!
I tried many different date formats in the date filter to parse all the event timestamps.


(Jaime Soriano) #6

@bab according to the documentation, the year is represented with lowercase yyyy for a full year or yy for the two-digit representation. Could you try lowercasing them in your patterns?
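As an untested sketch, the filter from post #5 with the year letters lowercased (and the millisecond separator changed to a colon, matching the p_icn sample) would read:

```
date {
  match => [ "date",
             "M/d/yy H:mm:ss:SSS",
             "yyyy-MM-dd'T'HH:mm:ss.SSS",
             "ISO8601" ]
}
```

A single `M/d/yy H:mm:ss:SSS` pattern should cover both padded and unpadded months, days, and hours, since single pattern letters still consume one or two digits when parsing.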


(brouk) #7

Hello @jsoriano,
thanks for your help. I solved the problem as you said: it was not a multiline issue but the date filter. I made this change to the date filter:

date {
   match => [ "date", "M/dd/yy HH:mm:ss.SSS", "yyyy-MM-dd'T'HH:mm:ss.SSS", "ISO8601" ]
   target => "date"
   locale => "en"
   timezone => "UTC"
   }

and I don't get the error anymore, BUT in Kibana, when I want to visualize the data by date, I can't find my field logdate. My question is: how can I parse both date formats, [6/13/18 8:11:25:022 CEST] and 2018-04-17T15:19:20.313, into my logdate field for Kibana visualization?


(Jaime Soriano) #8

Once the date is parsed, it is stored in the field specified in target, or in @timestamp if no target is specified.
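For example (an untested sketch based on the filter from post #7, with the millisecond separator adjusted to the colon used in the p_icn sample), pointing both formats at a common logdate field could look like:

```
date {
  match => [ "date",
             "M/d/yy H:mm:ss:SSS",
             "yyyy-MM-dd'T'HH:mm:ss.SSS",
             "ISO8601" ]
  target => "logdate"
}
```

With `target => "logdate"`, the parsed timestamp is written to logdate; without it, the filter would overwrite @timestamp instead.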


(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.