[2025-03-12T23:41:09,844][INFO ][logstash.outputs.elasticsearch][main][fb620d57153c1bad1d3ce0fa625d11e35cc5c8b4b14994fd6f9c6cba50debda9] Retrying failed action {:status=>429, :action=>["index", {:_id=>nil, :_index=>"example-server-2025.03.10", :routing=>nil}, {"partition"=>"example0", "nodeRole"=>"example-server", "msg"=>"[https-jsse-nio-443-exec-53] commands.PackageManagedAgentFilesCmd 11111111-0000-0000-0000-111111111111 1 DF7166DCCB6B6034AC1FBD9C9CF206DB.route1 - PackageManagedAgentFilesCmd: Exception: masterKey is null\r", "productCode"=>"example-cwc", "event"=>{"original"=>"2025-03-10T16:36:20,354 ERROR [https-jsse-nio-443-exec-53] commands.PackageManagedAgentFilesCmd 11111111-0000-0000-0000-111111111111 1 DF7166DCCB6B6034AC1FBD9C9CF206DB.route1 - PackageManagedAgentFilesCmd: Exception: masterKey is null\r", "hash"=>"a6a2a199c78676c0b8335156e9fa80622b8f9496"}, "log"=>{"level"=>"ERROR", "file"=>{"path"=>"C:/Program Files (x86)/example/policy/Server/logs/example.log.1"}}, "message"=>"2025-03-10T16:36:20,354 ERROR [https-jsse-nio-443-exec-53] commands.PackageManagedAgentFilesCmd 11111111-0000-0000-0000-111111111111 1 DF7166DCCB6B6034AC1FBD9C9CF206DB.route1 - PackageManagedAgentFilesCmd: Exception: masterKey is null\r", "host"=>{"name"=>"cwc-gm-0a2cc5ba"}, "podName"=>"example", "type"=>"example", "@version"=>"1", "@timestamp"=>2025-03-10T16:36:20.354Z}], :error=>{"type"=>"cluster_block_exception", "reason"=>"index [example-server-2025.03.10] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];"}}
input {
  file {
    path => "C:/Program Files (x86)/example/policy/Server/logs/example.log*"
    type => "example"
    codec => multiline {
      # treat any line that does not start with a timestamp as part of the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
    }
    start_position => "beginning"
    ignore_older => 86400 # ignore files older than 24 hours
    close_older => 86400  # free the resources
  }
}
The question here is: the file that needs to be pushed by Logstash to Elasticsearch is example.log.
On the server, in the same location, whenever the file size reaches 1 GB, example.log gets rotated to example.log.1 and then example.log.2.
What should the file input path be?
Why is the above issue happening, and why is Logstash reading logs older than three days?
Please advise.
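For reference, this is what I was thinking of trying, but I am not sure it is correct: dropping the trailing wildcard so that only the active example.log is tailed and the rotated example.log.1 / example.log.2 copies are skipped. The path below is only my guess, not a confirmed fix:

input {
  file {
    # guess: no trailing *, so only the active log file is matched
    path => "C:/Program Files (x86)/example/policy/Server/logs/example.log"
    type => "example"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
    }
    start_position => "beginning"
    ignore_older => 86400
    close_older => 86400
  }
}

Would that risk losing events written right before a rotation happens, or is there a better pattern for handling rotated logs?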