I think there is a problem in my multiline pattern, but I can't figure out where.
Here is my file input with the multiline codec:
file {
  path => "/var/log/all_logs/**/serverlogs/localhost.*.log"
  start_position => "beginning"
  sincedb_path => "/dev/null"
  codec => multiline {
    pattern => "^\d{2}-\d{3}-\d{4}"
    negate => true
    what => "previous"
  }
  type => "localhost"
}
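To illustrate what I expect the codec to do, here is a quick check of the pattern outside Logstash (a minimal Python sketch; the sample lines are made up, not taken from my real log):

import re

# Same pattern as in the multiline codec above
pattern = re.compile(r"^\d{2}-\d{3}-\d{4}")

# A line that matches should start a new event; a line that does not match
# should be appended to the previous event (negate => true, what => "previous").
sample_lines = [
    "20-123-2023 13:55:33.562 first line of an entry",  # shaped to match the regex, not a real date
    "\t\tat org.springframework.beans.factory.support.AbstractBeanFactory...",  # stack-trace line, no match
]
for line in sample_lines:
    print(bool(pattern.match(line)), repr(line[:40]))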
And here is the mapping of my date field, with the formats it accepts:
"time": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss||dd-MMM-yyyy HH:mm:ss.SSS",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
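If I read the format string correctly, values shaped like the ones below should be accepted. This is only a rough illustration using Python's strptime as an approximation of the Elasticsearch date formats, not the exact same parser:

from datetime import datetime

# Approximate strptime equivalents of the three formats in the mapping above
samples = [
    ("2023-02-23 13:55:33.562", "%Y-%m-%d %H:%M:%S.%f"),   # yyyy-MM-dd HH:mm:ss.SSS
    ("2023-02-23 13:55:33", "%Y-%m-%d %H:%M:%S"),          # yyyy-MM-dd HH:mm:ss
    ("23-Feb-2023 13:55:33.562", "%d-%b-%Y %H:%M:%S.%f"),  # dd-MMM-yyyy HH:mm:ss.SSS
]
for value, fmt in samples:
    print(value, "->", datetime.strptime(value, fmt))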
I can't post the full error because the log message is MASSIVE (the log entry I have to send spans thousands of lines).
Here is the start of the error:
[2023-04-20T13:55:33,562][WARN ][logstash.outputs.elasticsearch][main][619efc3cfdee97950b75b845b14a813a73e06f3ce7b8760276b342dbcaa01169] Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"test_index2", :routing=>nil}, {"event"=>{"original"=>"\t\tat org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)\n\t\tat org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)\n\t\tat org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)\n\t\tat org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1469)\n\t\tat org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:808)\n\t\tat org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:544)\n\t\tat org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:447)\n\t\t
And here is the end of it, as seen in the Logstash Docker container:
"log"=>{"file"=>{"path"=>"/var/log/all_logs/eures-batch/serverlogs/localhost.2023-02-23.log"}}, "@version"=>"1", "type"=>"localhost", "@timestamp"=>2023-04-20T13:35:34.037120292Z}], response: {"index"=>{"_index"=>"test_index2", "_id"=>"aaHgnocB-zc9hc34owwj", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [time] of type [date] in document with id 'aaHgnocB-zc9hc34owwj'. Preview of field's value: '%{day_localhost}-%{month_localhost}-%{year_localhost} %{time_localhost}'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [%{day_localhost}-%{month_localhost}-%{year_localhost} %{time_localhost}] with format [yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss||dd-MMM-yyyy HH:mm:ss.SSS]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}
The problem seems to come from the start of the message, which doesn't begin with the date, even though it normally should because of the multiline pattern, no? Is there a limit on the size of the logs? This one is really long.