Hi Magnus! Thanks a lot for your response!
At the moment I have the following files in a "pipeline" directory, and the Logstash process is started pointing at that directory:
02-beats-input.conf 11-sas.conf 30-sas-output.conf
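For reference, the process is started more or less like this (the install and directory paths are just an example, ours may differ):

bin/logstash -f /etc/logstash/pipeline/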
# 02-beats-input.conf: Beats input on port 5044, no TLS, ISO-8859-1 encoded lines
input {
  beats {
    port => 5044
    ssl => false
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}
# 11-sas.conf: parse the access-log line and take the event time from the log itself
filter {
  if [type] == "sas" {
    grok {
      match => { "message" => "%{IP:client} - - \[%{HTTPDATE:timestamp}\] \"POST /RTDM/rest/decisions/%{GREEDYDATA:tarificacion} %{DATA:protocol}\" %{NUMBER:code} %{NUMBER:bytes}" }
    }
    date {
      # sets @timestamp from the timestamp parsed out of the log line
      match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
      locale => "en"
    }
  }
}
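For context, this is a made-up example of the kind of line that grok pattern is meant to match (the IP, path and numbers are invented):

10.0.0.1 - - [05/Oct/2017:10:15:32 +0200] "POST /RTDM/rest/decisions/miTarifa HTTP/1.1" 200 1534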
# 30-sas-output.conf: print events to stdout and index them into the cluster
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => [
      "a.local:9200",
      "b.local:9200",
      "c.local:9200"
    ]
    sniffing => true
    # the daily index name is built from each event's @timestamp
    index => "sas-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
Log events are sent from the application machines via Filebeat, with "sas" set as the document type.
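The relevant part of the Filebeat config looks roughly like this (the log path is made up, and this assumes a 5.x-style prospector config):

filebeat.prospectors:
- paths:
    - /var/log/sas/access.log
  document_type: sas

That document_type is what ends up in the [type] field that the filter conditional checks.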
And in the end, Elasticsearch creates one index per day for the whole last month, plus an additional one for the current day.
I'm sure I'm doing something wrong, but I have no idea what. xD