Logstash log separation per pipeline doesn't work!

Hi guys.

We use ELK version 8.4.2.
On Logstash we have 15 pipelines, and now we need to separate the logs per pipeline.
There is a solution in the Elastic documentation:

The documentation says to add 2 directives to logstash.yml so that the logs are automatically written to a separate file per pipeline:

path.logs: LOGSTASH_HOME/logs

pipeline.separate_logs: true
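
If I understand the documentation correctly, with these two settings Logstash should write one log file per pipeline under path.logs, named after the pipeline.id. For two hypothetical pipelines called nozomi and dsell, the directory should look roughly like this:

logstash-plain.log        # main Logstash log
pipeline_nozomi.log       # log for pipeline.id "nozomi"
pipeline_dsell.log        # log for pipeline.id "dsell"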

and this is my logstash.yml

## Default Logstash configuration from Logstash base image.
## https://github.com/elastic/logstash/blob/master/docker/data/logstash/config/logstash-full.yml
#
http.host: "0.0.0.0"
node.name: "coz-logstash"
xpack.monitoring.elasticsearch.hosts: [ "${ELASTIC_HOST}" ]

## X-Pack security credentials
#
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: ${ELASTIC_ROOT_USER}
xpack.monitoring.elasticsearch.password: ${ELASTIC_ROOT_PASS}
api.auth.type: basic
api.auth.basic.username: ${ELASTIC_ROOT_USER}
api.auth.basic.password: ${ELASTIC_ROOT_USER}

config.reload.automatic: true
pipeline.separate_logs: true
path.logs: "/usr/share/logstash/log/"

These settings don't write any log files to the path.logs directory.
Are there any other settings I have to set?

I changed the permissions to 777 (just for testing), but there still isn't any log file.

Can anyone help me?

Have you set pipeline.id?

If enabled, Logstash will create a separate log file for each pipeline, using the pipeline.id as the name of the file.
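
For example, a pipeline defined like this (hypothetical id and path):

- pipeline.id: my_pipeline
  path.config: "/usr/share/logstash/pipeline/my_pipeline/*.conf"

should produce a file named pipeline_my_pipeline.log under path.logs.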

Hi @Rios.
Yes, I set the pipeline.id.
After 3 or 4 days there still isn't any log file.

This is my pipelines.yml :

- pipeline.id: ${PIPEID_COZ_PURCHASE_NOZOMI}
  path.config: "./pipeline/${PIPEID_COZ_PURCHASE_NOZOMI}/*.conf"
  pipeline.workers: 2

- pipeline.id: ${PIPEID_COZ_PURCHASE_DSELL}
  path.config: "./pipeline/${PIPEID_COZ_PURCHASE_DSELL}/*.conf"
  pipeline.workers: 2

- pipeline.id: ${PIPEID_COZ_BATTLE_SERVER}
  path.config: "./pipeline/${PIPEID_COZ_BATTLE_SERVER}/*.conf"

- pipeline.id: ${PIPEID_COZ_USER_STATIC}
  path.config: "./pipeline/${PIPEID_COZ_USER_STATIC}/*.conf"

The pipeline log naming is working perfectly on my side.

You should add a $ at the beginning: path.logs: ${LOGSTASH_HOME}/config
Also, it's useful to temporarily set these so you can see the LS settings values:
config.debug: true
log.level: debug

@Rios Where can I see the results of these settings?

config.debug: true
log.level: debug

In the command/terminal output.

Also, check that /etc/logstash/log4j2.properties contains the routing appender:

appender.routing.type = PipelineRouting
appender.routing.name = pipeline_routing_appender
appender.routing.pipeline.type = RollingFile
appender.routing.pipeline.name = appender-${ctx:pipeline.id}
appender.routing.pipeline.fileName = ${sys:ls.logs}/pipeline_${ctx:pipeline.id}.log
...

If you upgraded an old install to 8.4.2 it is possible that you have a log4j2.properties.rpmnew with that feature and a log4j2.properties without.

Useful note, thanks Badger. I usually upgrade with the new yml/properties files and just append the modified values from the older configuration files.

@Rios @Badger

I implemented the options you both suggested and checked several times, but none of them work.

We run Logstash in Docker, and I found out that we handle logging in docker-compose with these settings:

x-logging:
  &logging-default
  driver: "json-file"
  options:
    max-file: "5"
    max-size: "20m"

services:
  logstash:
    container_name: docker_logstash
    build:
      context: conf/coz/
      args:
        ELK_VERSION: '8.4.2'
    env_file:
      - ./conf/coz/logstash.env
    logging: *logging-default
    ports:
      - 9600:9600
      - 5044:5044

Could these settings be conflicting?
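
I also wonder whether I need to mount path.logs as a volume so the per-pipeline files show up on the host, something like this (hypothetical host path) under the logstash service:

    volumes:
      - ./logs/logstash:/usr/share/logstash/log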

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.