How can I get a copy of the parsed logs from Logstash on my local disk?
Our case:
We don't want to just send logs to Elasticsearch with Logstash; we also want to write the parsed logs to disk, so that we have a backup/archive of them. In case of data loss or an ELK failure, we don't want to end up losing important events from our system.
output {
  file {
    path => ...
    codec => line { format => "custom format: %{message}" }
  }
}
Or would you prefer the second method, backing up the Elasticsearch indices instead?
Another question, please: if I have log files covering two years and I want Logstash to send only the last two weeks of logs to Elasticsearch, how can I do this? Would something like the sketch below work?
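(Just a sketch: the logdate field name and its date pattern are assumptions for illustration; the ruby filter cancels anything older than 14 days.)

filter {
  # Parse the log's own timestamp into @timestamp.
  # Field name "logdate" and the pattern are assumptions for this sketch.
  date {
    match => ["logdate", "MMM dd HH:mm:ss"]
  }
  # Drop any event whose timestamp is older than 14 days.
  ruby {
    code => "event.cancel if event.get('@timestamp').time < (Time.now - 14 * 86400)"
  }
}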
Hi @magnusbaeck, I tested your suggestion with json_lines and I now get a file output on my disk using the following output configuration:
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
  file {
    path => "/path/backup-logs/BackupSyslog-%{+YYYY-MM-dd}.log"
    # codec => line { format => "custom format: %{message}" }
    # codec => "rubydebug"
    codec => "json_lines"
  }
}
I have two more questions:

1. How can I compress this file output with gzip, i.e., gzip the output stream before writing to disk? Is it correct to use gzip => true?

2. Where should the drop filter go: in the output section or in the filter section? If you have a sample script that uses the drop filter together with the elasticsearch output, please share it!
"1. How can I compress this file output with gzip, i.e., gzip the output stream before writing to disk? Is it correct to use gzip => true?"
Yes.
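For example, reusing the file output from your configuration (a minimal sketch; the .gz suffix in the path is just a naming convention I've added, not a requirement):

output {
  file {
    path => "/path/backup-logs/BackupSyslog-%{+YYYY-MM-dd}.log.gz"
    codec => "json_lines"
    # Gzip the output stream before writing to disk.
    gzip => true
  }
}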
"2. Where should the drop filter go: in the output section or in the filter section?"
Filters are always in the filter section.
"If you have a sample script that uses the drop filter together with the elasticsearch output, please share it!"
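There isn't much to it. As a sketch (the loglevel field and the "DEBUG" value are assumptions; adapt the condition to your data):

filter {
  # Drop matching events entirely; cancelled events never reach any output.
  # The "loglevel" field and "DEBUG" value are assumptions for this sketch.
  if [loglevel] == "DEBUG" {
    drop { }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}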