I am restarting this question; I was previously moved to a different task and am now back on this.
Logstash works for me when I use a direct file reference. With a single file path in the Logstash config:
input {
  file {
    path => "C:\Tools\LogRepository\transaction\Web\http\User001_HTTPS1.csv"
    type => "transactions"
    start_position => "beginning"
  }
}
it works, and all entries from User001_HTTPS1.csv are indexed into Elasticsearch.
However, when I put in a wildcard reference:
input {
  file {
    path => "C:\Tools\LogRepository\transaction\Web\http*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}
None of the files are picked up.
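From what I have read, the file input's glob patterns on Windows may need forward slashes rather than backslashes, so one thing I plan to try is the following (the forward slashes, and the assumption that the CSVs sit directly inside the "http" folder, are mine and not yet verified):

input {
  file {
    # forward slashes instead of backslashes, wildcard inside the http folder
    path => "C:/Tools/LogRepository/transaction/Web/http/*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}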
The problem is that this location contains individual user transaction data, and I cannot predict how many files will be there or what they will be named; it is a direct data dump.
The "Web" folder contains multiple folders ("http", "MQ", "FTP", etc.), and each of these folders contains a variable number of files.
So I am wondering whether the Logstash file input can support a tree structure, i.e. a directory containing directories containing file(s).
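For illustration, this is the kind of config I am hoping is possible; the recursive "**" glob (which the file input documentation suggests is supported via Ruby-style globbing) and the forward slashes are my assumptions, not something I have confirmed on this setup:

input {
  file {
    # "**" should descend into http, MQ, FTP, etc. and match any .csv file below Web
    path => "C:/Tools/LogRepository/transaction/Web/**/*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}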