Hi,
I'm trying to set up a date filter for my servers, and I'm getting an error.
Here is the shipper configuration:
input {
  file {
    path => ["/var/log/squid/cache.log"]
    type => "squidservice_mobylette"
    start_position => "end"
    stat_interval => 1
    tags => ["serveur_proxy", "squidservice_mobylette", "squidservice", "mobylette", "RES"]
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}
output {
  redis {
    host => ["195.221.2.55","195.221.2.56","195.221.2.57"]
    data_type => "list"
    key => "logstash"
  }
}
Everything looks correct to me. The shipper sends the logs to my cluster; I verified that with tcpdump.
Here is the Logstash configuration on my ELK cluster:
input {
  redis {
    host => "127.0.0.1"
    port => 6379
    data_type => "list"
    type => "redis-input"
    key => "logstash"
  }
  beats {
    port => 5045
    type => "filebeat-input"
    congestion_threshold => 300
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    if "tango" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-tango-%{+YYYY.MM.dd}"
      }
    }
    if "nginx" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-isg-%{+YYYY.MM.dd}"
      }
    }
    if "scarlette" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-scarlette-%{+YYYY.MM.dd}"
      }
    }
    if "serveur_owncloud" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-owncloud-%{+YYYY.MM.dd}"
      }
    }
    if "brouette" in [tags] or "poussette" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-mta-%{+YYYY.MM.dd}"
      }
    }
    if "serveur_proxy" in [tags] or "serveur_dns" in [tags] {
      elasticsearch {
        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
        index => "logstash-proxydns-%{+YYYY.MM.dd}"
      }
    }
  }
}
Here is my filter:
filter {
  if "squidservice" in [tags] {
    grok {
      patterns_dir => ["/etc/logstash/patterns.conf"]
      match => ["message", "%{SQUIDATE:squid_timestamp} kid1\| %{GREEDYDATA:content}"]
      add_tag => "groked"
    }
    date {
      match => ["squidate", "yyyy\/MM\/dd HH:mm:ss"]
      timezone => "Europe/Paris"
    }
  }
}
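For debugging, here is a minimal pipeline one could use to exercise this filter in isolation (a sketch, not my production config: paste one of the log lines below on stdin and inspect the resulting event with the rubydebug codec):

```
# throwaway test pipeline -- reads a raw log line from stdin,
# runs the same filter, and dumps the parsed event to stdout
input { stdin { } }
filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns.conf"]
    match => ["message", "%{SQUIDATE:squid_timestamp} kid1\| %{GREEDYDATA:content}"]
    add_tag => "groked"
  }
  date {
    match => ["squidate", "yyyy\/MM\/dd HH:mm:ss"]
    timezone => "Europe/Paris"
  }
}
output { stdout { codec => rubydebug } }
```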
The logs I want to parse look like this:
2016/11/03 11:16:30 kid1| ctx: enter level 0: 'http://hits-i.iubenda.com/write?db=hits1'
2016/11/03 11:16:30 kid1| NOTICE: found double content-length header
2016/11/03 11:24:27 kid1| ctx: exit level 0
2016/11/03 11:24:27 kid1| Logfile: opening log stdio:/var/log/squid/netdb.state
2016/11/03 11:24:27 kid1| Logfile: closing log stdio:/var/log/squid/netdb.state
2016/11/03 11:24:27 kid1| NETDB state saved; 0 entries, 0 msec
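For completeness, SQUIDATE is a custom pattern loaded from patterns_dir. A definition matching the timestamps above would look something like this (a sketch; the exact contents of my /etc/logstash/patterns.conf may differ):

```
# custom pattern for timestamps like "2016/11/03 11:16:30"
SQUIDATE %{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{TIME}
```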
Logstash on my cluster returns the following error:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [squid_timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2016/11/03 11:24:27\" is malformed at \"/11/03 11:24:27\""}}}}, :level=>:warn}
I don't know why Logstash can't parse my log date... Do you have any ideas?