Odd behavior when Logstash finishes a pipeline

I have two pipelines defined in pipelines.yml: one for my main pipeline which imports data from jdbc input, and another for the dead_letter_queue input.
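
Roughly like this, with the pipeline IDs and paths simplified for this post:

```yaml
# pipelines.yml -- simplified; the IDs and paths here are placeholders
- pipeline.id: main-jdbc
  path.config: "/etc/logstash/conf.d/main-jdbc.conf"
- pipeline.id: dlq-reader
  path.config: "/etc/logstash/conf.d/dlq-reader.conf"
```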

When I run Logstash and it gets to the end of the database, it seems to stop on the last database query and doesn't import anything else. I thought Logstash was supposed to end the pipeline and restart after it was done, or something like that. Is that not the normal behavior?

Logstash wants to run forever, processing streams of events.
Some inputs set up a server-like process that waits (forever) for clients to connect.
Some inputs read from local or networked (e.g. S3) folders in a loop, looking for new files to read or for more content in files already found (tail).
Some inputs query message brokers in a loop, fetching data from topics.
Some inputs use cron-like schedules to periodically read from a source.
Only one input, the generator, runs for a bounded amount of work and then shuts Logstash down.

Because of this continual streaming, the concept of reaching the end of a pipeline only applies when the generator input is used.
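
For example, a generator pipeline along these lines (the count and message are arbitrary) emits its events and then Logstash shuts itself down:

```
# generator.conf -- a minimal sketch; count and message are arbitrary
input {
  generator {
    count => 5            # emit exactly 5 events, then the input finishes
    message => "hello"    # payload for each generated event
  }
}
output {
  stdout { codec => rubydebug }   # print the events, then Logstash exits
}
```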

So for the JDBC input, is it best practice to use the schedule parameter rather than just letting it run once?

In most cases, yes.
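
A sketch of a scheduled jdbc input; the connection settings, query, and schedule below are placeholders, not something specific to your setup:

```
# jdbc.conf -- a sketch; driver, connection string, and query are placeholders
input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "logstash"
    schedule => "*/5 * * * *"     # rerun the query every 5 minutes (cron syntax)
    statement => "SELECT * FROM events WHERE id > :sql_last_value"
    use_column_value => true      # track a column value instead of the run timestamp
    tracking_column => "id"       # so each scheduled run fetches only new rows
  }
}
```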

However, if you want to use the elasticsearch filter to do enrichment in another pipeline, and you choose to load the records from a static reference DB table into ES, then you would run this "loader" pipeline only once.
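
The loader pipeline would just be a jdbc input feeding an elasticsearch output, run once by hand. The enrichment side could then look something like this sketch, assuming the reference data landed in an index called lookup-reference (the index, query, and field names are all made up for illustration):

```
# filter section of the main pipeline -- a sketch; names are illustrative only
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "lookup-reference"
    query => "code:%{[product_code]}"   # match the event's code against the reference index
    fields => { "description" => "product_description" }   # copy a field from the hit onto the event
  }
}
```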

