I want to parse this log line with a grok filter:
[ ERROR] 02.04.2016. 20:38:19 (FileManagerServlet:handleDownload) Date and time: Sat Apr 02 20:38:19 CEST 2016| miliseconds: 1459622299268| + session id: D4190DFF52C536C500FAF0947DB120DC| userId: 145962184466057
and I'm using the following grok pattern, which works fine in the Grok Debugger:
[ %{LOGLEVEL:log_level}] %{DATE_EU:date}. %{TIME:time} (%{GREEDYDATA:class}:%{GREEDYDATA:operation}) Date and time: %{GREEDYDATA:full_date_and_time}| miliseconds: %{BASE10NUM:milis}| + session id: %{GREEDYDATA:session_id}| userId: %{BASE10NUM:user_id}
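For context, the pattern is applied through a standard grok filter stanza, roughly like this (a simplified sketch with the middle of the pattern elided, not my exact config):

```
filter {
  grok {
    match => {
      "message" => "[ %{LOGLEVEL:log_level}] %{DATE_EU:date}. %{TIME:time} ... userId: %{BASE10NUM:user_id}"
    }
  }
}
```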
However, the output in the terminal where I started Logstash was the following:
{
"@version" => "1",
"file_name" => "cris_download_log",
"@timestamp" => 2022-02-01T22:05:15.607Z,
"host" => "synnslt6s40663-l",
"path" => "/home/mihailo/Desktop/CRIS_UNS/cris_download_log.log",
"message" => "[ ERROR] 02.04.2016. 20:38:19 (FileManagerServlet:handleDownload) Date and time: Sat Apr 02 20:38:19 CEST 2016| miliseconds: 1459622299268| + session id: D4190DFF52C536C500FAF0947DB120DC| userId: 145962184466057"
}
I want the grok filter to extract the fields and index them into the document, something like this:
{
"@version" => "1",
"file_name" => "cris_download_log",
"@timestamp" => 2022-02-01T22:05:15.607Z,
"host" => "synnslt6s40663-l",
"path" => "/home/mihailo/Desktop/CRIS_UNS/cris_download_log.log",
"logLevel" => "ERROR",
"date" => "02.04.2016",
"time" => "20:38:19",
"class" => "FileManagerServlet",
"operation" => "handleDownload",
"full_date_and_time" => "Sat Apr 02 20:38:19 CEST 2016",
"milis" => 1459622299268,
"session id" => "D4190DFF52C536C500FAF0947DB120DC",
"userId" => 145962184466057
}
My whole Logstash config file looks like this:
Any help would be very welcome!