Logging from folder with subfolders

Hello!
I have the following log structure:
/var/log/stash/site1/*.log
/var/log/stash/site1/accounts/account1/*.log
/var/log/stash/site1/otherKind1/*.log
/var/log/stash/site1/otherKind2/*.log
Each site can contain many subfolders, and the same goes for site2, site3, etc.

How should I use a grok filter (or something else) to build the Elasticsearch index name from these subfolders: the site name (site1, site2, site3, ...), the kind, and for "accounts" the account name, i.e.:
%{site}
%{site}-%{kind}
%{site}-account-%{account}
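For reference, grok named captures use the form (?<field>pattern), and a single grok filter can try several patterns in order, stopping at the first match because break_on_match defaults to true. A rough sketch of a filter that would populate these fields from the paths above (the field names site, kind, and account are assumptions taken from the index scheme):

```text
filter {
  grok {
    # Most specific pattern first; grok stops at the first match.
    match => {
      "path" => [
        "/var/log/stash/(?<site>[^/]+)/accounts/(?<account>[^/]+)/[^/]+\.log",
        "/var/log/stash/(?<site>[^/]+)/(?<kind>[^/]+)/[^/]+\.log",
        "/var/log/stash/(?<site>[^/]+)/[^/]+\.log"
      ]
    }
  }
}
```

With this ordering, an account log sets site and account, a kind log sets site and kind, and a top-level site log sets only site.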

I'm trying to do it like this:
input {
  file {
    path => [ "/var/log/stash/**/*.log" ]
    type => "file"
  }
}
filter {
  grok {
    match => { "path" => "/var/log/stash/(?<site>[^/]+)/(?<kind>[^/]+)/[^/]+\.log" }
  }
  grok {
    match => { "path" => "/var/log/stash/(?<site>[^/]+)/[^/]+\.log" }
  }
  ...
}
output {
  if [type] == "file" {
    elasticsearch {
      hosts => "127.0.0.1:9200"
      index => "logstash-%{site}-%{kind}-%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}

But I get no results.

Are the site and kind fields being populated, i.e. are the grok filters working? Forget about the elasticsearch output for now, use the stdout { codec => rubydebug } output while debugging.
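Following that advice: a grok filter that fails to match adds a _grokparsefailure tag by default, so the rubydebug output shows directly whether the patterns worked. A minimal output section for debugging (a sketch; leave out the elasticsearch block until the fields appear):

```text
output {
  # Print every event while debugging. A failed grok shows up as
  # "tags" => ["_grokparsefailure"] and missing site/kind fields.
  stdout { codec => rubydebug }
}
```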

Magnus Bäck, thanks. I found the problem and the solution.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.