Good evening,
I used the Grok Debugger dev tool to build a grok rule for Squid access logs.
The config file is the following:
input {
  file {
    path => "/var/elastik/access.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "%{NUMBER:timestamp}%{SPACE}%{NUMBER:duration}\s%{IP:client_address}\s%{WORD:cache_result}/%{POSINT:status_code}\s%{NUMBER:bytes}\s%{WORD:request_method}\s%{NOTSPACE:url}\s%{NOTSPACE:user}\s%{WORD:hierarchy_code}/%{NOTSPACE:server}\s%{NOTSPACE:content_type}" ]
  }
  date {
    match => [ "timestamp", "UNIX" ]
    remove_field => [ "timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "squid"
  }
  stdout { codec => json_lines }
}
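One variant I am considering (a guess on my part, not something the docs prescribe): Squid's native log format right-pads the duration column with spaces, so the single `\s` separators that match the Debugger sample may not match every line in the live file. The same filter with `%{SPACE}` (which matches one or more whitespace characters) between all fields would look like this:

```
filter {
  grok {
    match => [ "message", "%{NUMBER:timestamp}%{SPACE}%{NUMBER:duration}%{SPACE}%{IP:client_address}%{SPACE}%{WORD:cache_result}/%{POSINT:status_code}%{SPACE}%{NUMBER:bytes}%{SPACE}%{WORD:request_method}%{SPACE}%{NOTSPACE:url}%{SPACE}%{NOTSPACE:user}%{SPACE}%{WORD:hierarchy_code}/%{NOTSPACE:server}%{SPACE}%{NOTSPACE:content_type}" ]
  }
}
```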
The input in the Grok Debugger:
1616454089.572 259 172.25.157.81 TCP_TUNNEL/200 5630 CONNECT api-eu1.xbc.trendmicro.com:443 - HIER_DIRECT/18.156.104.89 -
And the result is the following:
{
"server": "18.156.104.89",
"status_code": "200",
"hierarchy_code": "HIER_DIRECT",
"request_method": "CONNECT",
"url": "api-eu1.xbc.trendmicro.com:443",
"duration": "259",
"content_type": "-",
"bytes": "5630",
"cache_result": "TCP_TUNNEL",
"client_address": "172.25.157.81",
"user": "-",
"timestamp": "1616454089.572"
}
However, after deploying the pipeline, the documents in Kibana show a problem: the events are tagged with _grokparsefailure and the fields defined in the index are incorrect. Any idea?
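To sanity-check the pattern outside Logstash, I hand-translated it into a Python regex (a rough approximation: the field sub-patterns are simplified stand-ins for the grok ones, and the separators are relaxed to `\s+` to tolerate possible column padding in the live log):

```python
import re

# Hand-translated approximation of the grok pattern above; each named
# group is a simplified stand-in for the corresponding grok field.
SQUID_RE = re.compile(
    r"(?P<timestamp>\d+\.\d+)\s+"                       # %{NUMBER:timestamp}%{SPACE}
    r"(?P<duration>\d+)\s+"                             # %{NUMBER:duration}
    r"(?P<client_address>\S+)\s+"                       # %{IP:client_address}
    r"(?P<cache_result>\w+)/(?P<status_code>\d+)\s+"    # %{WORD}/%{POSINT}
    r"(?P<bytes>\d+)\s+"                                # %{NUMBER:bytes}
    r"(?P<request_method>\w+)\s+"                       # %{WORD:request_method}
    r"(?P<url>\S+)\s+"                                  # %{NOTSPACE:url}
    r"(?P<user>\S+)\s+"                                 # %{NOTSPACE:user}
    r"(?P<hierarchy_code>\w+)/(?P<server>\S+)\s+"       # %{WORD}/%{NOTSPACE}
    r"(?P<content_type>\S+)$"                           # %{NOTSPACE:content_type}
)

# The sample line from the Grok Debugger test above.
line = ("1616454089.572 259 172.25.157.81 TCP_TUNNEL/200 5630 CONNECT "
        "api-eu1.xbc.trendmicro.com:443 - HIER_DIRECT/18.156.104.89 -")

m = SQUID_RE.match(line)
print(m.groupdict() if m else "no match")
```

Running it against lines tailed from the real access.log (rather than the Debugger sample) should show whether the live file differs from what the pattern expects.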