Logstash pipeline processing

Hello,
I am having a problem with my Logstash configuration. I can't get the pipelines configuration right. My scenario:

logstash.yml - only lines which are uncommented/changed

path.data: /usr/share/logstash
pipeline.workers: 4
pipeline.batch.size: 1536
log.level: error
path.logs: /var/log/logstash

pipelines configuration:

- pipeline.id: files_log
  path.config: "/etc/logstash/pipelines/gzipped_data/01-checked_files.conf"
  pipeline.workers: 1
- pipeline.id: access_log
  path.config: "/etc/logstash/pipelines/access/02-access_log.conf"
  pipeline.workers: 2

At first I thought the problem was in the pipeline configuration, but now it looks like a problem with workers. My Logstash has 4 workers: on startup it takes one worker for files_log and two workers for access_log. Somehow (I thought workers always stayed pinned to their pipelines), when a new file appears in files_log, Logstash can't assign a worker to the pipeline, and I must restart the Logstash process to get the workers re-assigned.

Where did I make a mistake in the config? I can't find the error on my own :\

Thanks & kind regards,
Karl

If you set that in logstash.yml, then any pipeline that does not have a pipeline-specific setting for pipeline.workers will have 4 worker threads. Since both of your pipelines do have a pipeline-specific setting, it really has no effect.
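To illustrate the inheritance (a sketch reusing the paths from your pipelines configuration; the second pipeline deliberately omits pipeline.workers):

```yaml
# pipelines.yml sketch: how pipeline.workers inheritance behaves
- pipeline.id: files_log
  path.config: "/etc/logstash/pipelines/gzipped_data/01-checked_files.conf"
  pipeline.workers: 1    # explicit: this pipeline always runs 1 worker
- pipeline.id: access_log
  path.config: "/etc/logstash/pipelines/access/02-access_log.conf"
  # no pipeline-specific setting here, so this pipeline would inherit
  # pipeline.workers: 4 from logstash.yml
```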

What does 01-checked_files.conf look like?

Hello,
Thanks for the reply.
Here is my configuration (the input section in detail; the filter and output sections are probably without issues, since I didn't notice any problems with filtering or saving logs in ES):

input {
  file {
    path => "/data/files-log/*.gz"
    sincedb_path => "/data/files-log.db"
    sincedb_clean_after => 5
    start_position => "beginning"
    mode => "read"
    file_completed_log_path => "/var/log/logstash/logstash-files-log_done.log"
    file_completed_action => "log_and_delete"
    exit_after_read => "true"
    type => "fileslog"
  }
}

filter {
  if [type] == "fileslog" {
    grok { ...}
    mutate { ... }
  }
}

output {
#  stdout { codec => rubydebug }
  if [type] == "fileslog" {
    elasticsearch { ... }
  }
}

Thank you
Kind regards,
Karl Wolf

I suspect that you are not understanding what this does. The documentation is here.
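In read mode, exit_after_read => "true" tells the file input to shut down once it has read all the files currently matching path, so files that arrive later are never picked up until Logstash is restarted. A sketch of the input without it, based on the configuration above (abbreviated):

```text
input {
  file {
    path => "/data/files-log/*.gz"
    mode => "read"
    # exit_after_read removed: with it set to "true" the input stops
    # watching after the initial files are read, which matches the
    # "must restart Logstash for new files" symptom
    start_position => "beginning"
    type => "fileslog"
  }
}
```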

Hello Badger,
Thanks for your answer. I deleted
exit_after_read => "true"
and it looks better... but it is not fully working right yet.
Right now, after a restart, Logstash processed the files that were already in the folder and waited for new ones.
After a new file transfer, Logstash processed three of the six files (leaving three unparsed). After the next transfer (5 files) it again took only three of them. So it is better, but not ideal.

I don't get why it is not deterministic...
Regards,
Karl Wolf

That sounds a lot like inode re-use. Read this thread.
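For context: with file_completed_action => "log_and_delete", the processed files are deleted, freeing their inodes. A new file that happens to reuse an inode still tracked in the sincedb can be skipped as already read, which would explain only some of the new files being processed. One mitigation in that situation (a sketch, not a guaranteed fix; sincedb_clean_after is in days and accepts decimals per the file input docs) is to expire sincedb entries quickly:

```text
input {
  file {
    path => "/data/files-log/*.gz"
    sincedb_path => "/data/files-log.db"
    mode => "read"
    file_completed_action => "log_and_delete"
    # expire sincedb entries quickly so a reused inode is not
    # mistaken for an already-read file (value in days)
    sincedb_clean_after => 0.01
    type => "fileslog"
  }
}
```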

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.