Hi,
I have a local log file which is updated constantly, and I have the following file input configured:
file {
  path => ["/home/joseph/Desktop/audit.log"]
  start_position => "beginning"
  sincedb_path => "/dev/null"
  exclude => "*.gz"
}
However, since I set start_position to "beginning", the entire file is read again each time the file is updated. Is it possible to configure Logstash to ship only the new log lines (i.e. tail the file) to Elasticsearch?
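In case it helps, this is the direction I've been guessing at, but I haven't confirmed it: point sincedb_path at a real file so the read position survives restarts, and use start_position => "end" so the input tails the file instead of re-reading it. The sincedb location below is just an example path on my machine.

file {
  path => ["/home/joseph/Desktop/audit.log"]
  # "end" is the default; only new lines appended to the file should be read
  start_position => "end"
  # example path (not tested): persist the read offset so it isn't lost on restart
  sincedb_path => "/home/joseph/.sincedb_audit"
  exclude => "*.gz"
}

Is this the right approach, or is there another setting I'm missing?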