Having file and Elasticsearch outputs, with filtering applied only to the Elasticsearch output

Hi,

I was hoping to use the ELK stack to take over from an old scp- and regex-based log centralization system we have running. Right now all in-house application logs are collected at a central location using a remote SSH call that gzips the logs, ships them to the central server, and creates a path based on application name, date, and hostname. A number of in-house applications search through that log data in its present location. The data is not filtered.

What I was hoping was that we could send the logs to Logstash, have Logstash write them to the desired location based on appname, date, and filename, and then also send the data to Elasticsearch.
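Something like the following sketch, assuming the events carry appname and filename fields (the field names and the path layout here are just placeholders for our real scheme):

    output {
      # unfiltered copy on disk, path built from event fields
      file {
        path => "/central/logs/%{appname}/%{+YYYY-MM-dd}/%{filename}"
      }
      # the same events indexed for search
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }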

What I see as a problem is that if I have the data filtered, the file output plugin will write out the filtered data, whereas I want it to write out unfiltered data.

Any hints or interesting workarounds would be appreciated.

-S

You could run multiple Logstash instances or use the clone filter to split each event in two and apply the filtering selectively.
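A rough, untested sketch of the clone approach, relying on the clone filter setting the type field of each copy to the clone name (the grok pattern and paths are placeholders):

    filter {
      # duplicate every event; the copy gets its type set to "for_es"
      clone {
        clones => ["for_es"]
      }
      # apply the filtering only to the copy destined for Elasticsearch
      if [type] == "for_es" {
        grok {
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
      }
    }
    output {
      if [type] == "for_es" {
        elasticsearch {
          hosts => ["localhost:9200"]
        }
      } else {
        # the original, unfiltered event goes to the file tree
        file {
          path => "/central/logs/%{appname}/%{+YYYY-MM-dd}/%{filename}"
        }
      }
    }

The multiple-instances option is simpler conceptually (one pipeline filters and ships to Elasticsearch, the other writes raw files) but doubles the ingestion work, so the clone filter is usually the cheaper route.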