Reading Old Log files


I have an ELK server set up, and Logstash keeps reading the old files along with the new ones.
How can I prevent that?

Can we see your config?

Yes sure:

Input config:

input {
  lumberjack {
    port => 5043
    type => "logs"
    codec => "json"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  grok {
    match => ["message", "(?[0-9.]{1,6}) %{WORD:method} %{TIMESTAMP_ISO8601:RequestTime} %{URIPATHPARAM:RequestUrl} %{TIMESTAMP_ISO8601:RequestTime} %{NUMBER:StatusCode} %{DATA:level}"]
    tag_on_failure => []
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}


So how are your logs getting sent to LS? Because there isn't anything there that would indicate why this may happen.

Oh sorry! The logs get into Logstash using logstash-forwarder.

Ok, but you missed my point :slightly_smiling:

How do logs get to Logstash?

Ohh sorry! The logs get into Logstash using LSF (logstash-forwarder).

That's the logstash-forwarder config on the client:
{
  "network": {
    "servers": [ "log-server:5043" ],

    "ssl certificate": "/etc/logstash/logstash-certs/logstash-forwarder.crt",
    "ssl ca": "/etc/logstash/logstash-certs/logstash-forwarder.crt",

    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/opt/logs/*.log" ],
      "fields": { "type": "log", "environment": "production" }
    },
    {
      "paths": [ "/var/log/syslog" ],
      "fields": { "type": "syslog" }
    }
  ]
}

Ok, LSF is deprecated; you should use Filebeat.

That aside, I don't see a problem with your config, do things get moved aside on the FS?
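For reference, a Filebeat equivalent of the LSF config above could look roughly like this. This is a sketch only: the paths, fields, and `log-server` host are taken from the thread, but the exact keys depend on your Filebeat version, and Filebeat speaks the Beats protocol, so the Logstash side would need a `beats` input instead of `lumberjack`:

```
# filebeat.yml (sketch; verify option names against your Filebeat version)
filebeat.inputs:
  - type: log
    paths:
      - /opt/logs/*.log
    fields:
      type: log
      environment: production
  - type: log
    paths:
      - /var/log/syslog
    fields:
      type: syslog

output.logstash:
  hosts: ["log-server:5043"]
  ssl.certificate_authorities:
    - /etc/logstash/logstash-certs/logstash-forwarder.crt
```

Filebeat keeps a registry file recording how far it has read into each file, which is what prevents re-reading old data across restarts.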

Yes, I have migrated a few servers from LSF to Filebeat. But I have one doubt: since I have around 100+ servers all sending logs to one single server, what is the best way to scale out?
And will this issue go away completely with Filebeat?
And yes, we are using logrotate to move the old logs.

We recommend the use of a broker for that.
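The usual pattern is to have the shippers write into a message queue (e.g. Redis or Kafka) and have a pool of Logstash indexers pull from it, so you can add indexers as load grows. A minimal sketch using the Logstash Redis plugins, with a hypothetical broker host and list key:

```
# On the Logstash instance receiving from the 100+ shippers:
output {
  redis {
    host => "broker-host"     # hypothetical broker hostname
    data_type => "list"
    key => "logstash"         # hypothetical list key
  }
}

# On each indexing Logstash instance (run as many as needed):
input {
  redis {
    host => "broker-host"
    data_type => "list"
    key => "logstash"
  }
}
```

Because Redis lists hand each event to exactly one consumer, you can scale horizontally just by starting more indexers against the same key.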

Alright @warkolm thanks a lot.