LogStash multiple files


(Shashi) #1

I am restarting this question: I was previously moved to a different task, and now I am back on this.

Logstash works for me when I use a direct file reference in the config:

input {
  file {
    path => "C:\Tools\LogRepository\transaction\Web\http\User001_HTTPS1.csv"
    type => "transactions"
    start_position => "beginning"
  }
}

It works, and all entries from User001_HTTPS1.csv are indexed into Elasticsearch.
However, when I put in a wildcard reference:

input {
  file {
    path => "C:\Tools\LogRepository\transaction\Web\http\*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}

None of the files are picked up.

The problem for me is that this location contains individual user transaction data, and I cannot say in advance how many files there will be or what they will be named - it is a direct data dump.
The "Web" folder contains multiple folders - "http", "MQ", "FTP", etc. - and each of these folders contains a variable number of files.
So I am wondering whether the Logstash file input can support a tree structure, i.e. a directory containing directories containing file(s).
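If the file input follows Ruby-style glob semantics, a `**` pattern should match files at any depth below the Web folder. A sketch of such a config, reusing the paths from the posts above (the recursive pattern and the forward slashes are assumptions for illustration, not something verified in this thread):

```
input {
  file {
    # "**" descends into nested directories (http, MQ, FTP, ...);
    # forward slashes avoid backslash-escaping issues on Windows
    path => "C:/Tools/LogRepository/transaction/Web/**/*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}
```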


(Shashi) #2

My logstash conf file is below.

input {
  file {
    path => "C:\Shashi\Tools\LogRepository\transaction\Web\**\*.csv"
    type => "transactions"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Username", "InterfaceType", "UserType", "MessageType",
                "MessageTemplate", "StartTimeinMillis", "EndTimeinMillis",
                "StartTime", "EndTime"]
    remove_field => [ "path", "message" ]
  }

  date {
    match => [ "StartTime" , "dd-MMM-yyyy HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "transaction_data"
  }
  stdout {}
}


(Magnus Bäck) #3

Have you tried using forward slashes instead of backslashes?
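Applied to the path from post #2, that suggestion might look like the following (the `**` recursive pattern for the directory tree is an assumption, not confirmed in this thread):

```
file {
  # backslash form: C:\Shashi\Tools\LogRepository\transaction\Web\**\*.csv
  path => "C:/Shashi/Tools/LogRepository/transaction/Web/**/*.csv"
  type => "transactions"
  start_position => "beginning"
}
```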


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.