How to have Logstash update automatically as logs come in from the application and send them to Kibana graphs

I want Logstash to run continuously, picking up new logs as they come in from a web application and sending them to Kibana's graphs. Right now I have to run Logstash manually, and only once it finishes does the data show up in Kibana. I want it to parse the logs live. How would I go about doing this?

What input are you using?

Tomcat access logs, sent to Elasticsearch via Logstash and visualized in Kibana.

In your logstash configuration, what does the input section look like?

input {
  file {
    path => ["/route/app/file/tomcat/logs/*"]
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
If you leave Logstash running, for example as a service, it will keep reading lines from the logs as data is appended to existing files or new files are created.
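A minimal sketch of what such an always-running pipeline could look like, assuming a local Elasticsearch at http://localhost:9200; the host, the index name, and the absence of any filter section are assumptions for illustration, not taken from this thread:

input {
  file {
    # Tail the Tomcat access logs: new lines and new files matching the
    # glob are picked up for as long as Logstash stays running.
    path => ["/route/app/file/tomcat/logs/*"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]        # assumed local Elasticsearch
    index => "tomcat-access-%{+YYYY.MM.dd}"   # placeholder daily index for Kibana to read
  }
}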

So it automatically does it?

And once I have parsed through all the logs and built the graphs, how would I set it up so that it only picks up newly created logs and sends them to the graphs, instead of starting the whole process over again?

That is what the sincedb is for. If you remove the sincedb_path option, then when Logstash stops it will write out to disk (you do not need to worry about where) how much of each file it has read. When it starts again it will continue reading each file from that point.
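As a sketch, with the sincedb_path option removed the input section could look like this; the commented-out explicit path is hypothetical, only there to show the alternative of pointing the sincedb at a writable file of your own:

input {
  file {
    path => ["/route/app/file/tomcat/logs/*"]
    start_position => "beginning"
    # No sincedb_path: Logstash keeps its read offsets in its default data
    # directory and resumes each file from the recorded position on restart.
    # Alternatively, point it at a writable file you choose (hypothetical path):
    # sincedb_path => "/var/lib/logstash/tomcat-access.sincedb"
  }
}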

So when I removed sincedb my logs parse fine, but when I had sincedb I got an error and my logs didn't parse. It says "failed to execute action".

What error, exactly?

I want Logstash to pick up parsing where it left off. So if it has already parsed all of January's logs, I want it to start from February's, since it has already parsed the earlier ones, instead of starting over.

That's what it would normally do with a file input with a sincedb.
