Hello,
I want to ship user logs (Java, multiline) with Filebeat to Logstash, but the merged multiline events pick up lines that belong to events from other files.
My config:
The user logs are:
/tmp/user1/application.log
/tmp/user2/application.log
/tmp/user3/application.log
/tmp/user4/application.log
... and so on.
My Filebeat config, which ships to Logstash:

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    # Each - is a prospector. Below are the prospector specific configurations
    -
      paths:
        - /tmp/*/application.log
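
There is no multiline handling configured in Filebeat itself, so every log line is shipped as its own event and the joining happens in Logstash. Each event arrives roughly like this (example values, trimmed to the relevant fields):

{
  "message": "... one line of the Java log ...",
  "source": "/tmp/user1/application.log"
}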
On the Logstash server I receive the events with a beats input and then apply my filter.
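The input is just the standard beats input; a minimal sketch (the port here is only an example):

input {
  beats {
    port => 5044
  }
}

And the filter: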
multiline {
  patterns_dir => "/etc/logstash/conf.d/grok_patter"
  pattern => "^%{TIMESSTAMP}"
  negate => true
  what => "previous"
}
grok {
  patterns_dir => "/etc/logstash/conf.d/grok_patter"
  match => {
    "message" => "%{TIMESSTAMP:logging_time} \| %{LOGLEVEL:level} \| %{DATA:threadname} \| %{JAVACLASS:class} \| %{SPACE} \| %{JAVALOGMESSAGE:logmessage}"
  }
  add_field => [ "IndexName", "client-application" ]
}
grok {
  match => [ "source", "/tmp/(?<userid>[^/]+)/(?<filename>[^/]+)" ]
}
date {
  match => [ "logging_time" , "dd MMM yyyy HH:mm:ss,SSS", "MMM yyyy HH:mm:ss,SSS" ]
}
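
For reference, a single log entry looks roughly like this (illustrative values, derived from the grok and date patterns above; continuation lines such as stack traces have no timestamp):

01 Feb 2016 10:15:30,123 | INFO | main | com.example.SomeClass |  | Something went wrong
java.lang.NullPointerException
    at com.example.SomeClass.method(SomeClass.java:42)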
The JSON output of one event is:

"source": [
  "/tmp/user1/application.log",
  "/tmp/user10/application.log"
],
"filename": [
  "application.log",
  "application.log"
],
"userid": [
  "user1",
  "user10"
]
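
What I expected was a single value per field in each event, e.g.:

"source": "/tmp/user1/application.log",
"filename": "application.log",
"userid": "user1"

Instead, values from two different files end up in the same event.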
What is wrong? Why are userid, filename and source duplicated (arrays with values from two different files)? Any ideas?

br