Hi guys.
I am running Logstash 8.8.2 on Docker with more than 10 pipelines.
I want to write each pipeline's logs to a separate log file. According to the Logstash documentation, setting path.logs and pipeline.separate_logs in logstash.yml should separate the pipeline logs.
I set both parameters, but it is not working.
Are there any other options I need to set? Note: I want to separate the logs that the container prints, not the output or results of the pipelines themselves. This is my logstash.yml:
http.host: "0.0.0.0"
node.name: "logstash-coz-ls-transaction"
xpack.monitoring.elasticsearch.hosts: [ "${ELASTIC_HOST}" ]
monitoring.enabled: false
monitoring.cluster_uuid: "${ELASTIC_CLUSTER_UUID}"
## X-Pack security credentials
#
xpack.monitoring.elasticsearch.username: ${ELASTIC_ROOT_USER}
xpack.monitoring.elasticsearch.password: "${ELASTIC_ROOT_PASS}"
api.auth.type: basic
api.auth.basic.username: ${ELASTIC_ROOT_USER}
api.auth.basic.password: "${ELASTIC_ROOT_PASS}"
allow_superuser: true
config.reload.automatic: true
# This value set because of this warning: The default value of `api.auth.basic.password_policy.mode` will change to `ERROR` in a future release of Logstash. Set it to `ERROR` to observe the future behavior early, or set it to `WARN` to lock in the current behavior.
api.auth.basic.password_policy.mode: ERROR
path.logs: "/usr/share/logstash/logs"
pipeline.separate_logs: true
log.level: info
So, can you provide some context on what is not working? For example, share some of the logs you are getting.
You said that you have more than 10 pipelines, but in your pipelines.yml you have just 2 pipelines, so you should get only 2 log files, one for each pipeline id, after changing the pipeline.separate_logs setting and restarting Logstash.
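If I remember correctly, with pipeline.separate_logs enabled Logstash writes one file per pipeline id under path.logs, so you should end up with something like this (the pipeline ids below are just placeholders for whatever is in your pipelines.yml):

/usr/share/logstash/logs/pipeline_pipeline-one.log
/usr/share/logstash/logs/pipeline_pipeline-two.log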
Everything works fine, the pipelines too, but the log files have not been created.
I don't have any problem reading the logs or errors. This is the output of docker compose logs -f:
logstash-competition | [2023-07-03T13:49:31,280][INFO ][logstash.inputs.jdbc ][coz-ls-pipe-battlelog-special][ca1ad565781114828ae3376f73c8954e387ca45e3bad51ab2dd177af988256bb] (0.690862s) SELECT count(*) AS `count` FROM (SELECT nozomi_clan_hero.id AS user_id,
logstash-competition | nozomi_clan_hero.hid AS attack_hero_id,
logstash-competition | nozomi_clan_hero.ctime AS attack_time,
logstash-competition | nozomi_clan_hero.btime AS attack_purchase_time,
logstash-competition | nozomi_clan_hero.htime AS attack_reward_collection_time,
logstash-competition | nozomi_clan_hero.anum AS attack_count,
logstash-competition | nozomi_clan_hero.buys AS attack_count_purchase,
logstash-competition | nozomi_clan_hero.hrwd AS attack_reward,
logstash-competition | nozomi_clan_hero.anum - nozomi_clan_hero.buys AS attack_count_free,
logstash-competition | crystal_table.value AS user_crystal,
logstash-competition | chip_table.value AS user_chip
logstash-competition |
logstash-competition | FROM nozomi_clan_hero
logstash-competition | LEFT JOIN user_properties AS crystal_table
logstash-competition | ON ( nozomi_clan_hero.id = crystal_table.id AND crystal_table.pid = 4 )
logstash-competition |
logstash-competition | LEFT JOIN user_properties AS chip_table
logstash-competition | ON ( nozomi_clan_hero.id = chip_table.id AND chip_table.pid = 7 )
logstash-competition |
logstash-competition | WHERE nozomi_clan_hero.ctime > 1688392144 AND nozomi_clan_hero.ctime <= UNIX_TIMESTAMP(NOW())
logstash-competition |
logstash-competition | ORDER BY nozomi_clan_hero.ctime ASC) AS `t1` LIMIT 1
Yes, I have more than 10 pipelines, but they are in other containers.
Since I separated the containers, debugging got easier, but I still couldn't separate the logs.
My problem is that the log files are not created at all.
I do not use Docker, but looking at how the container is built in the Elastic GitHub repository, it seems that by default it will log to stdout, no matter whether you have set path.logs or not; this is mentioned in the documentation.
Looking at the docker-entrypoint file, it seems to use an environment variable to change this behavior: setting $LOG_STYLE to file would tell it to log to files, but this is not mentioned in the documentation.
I think that if you provide this environment variable with the value file, it will log to files, but I'm not sure since I do not use Docker.
You may try it, or wait for someone from Elastic to provide more information.
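If you want to try it, a minimal sketch of what that could look like in your compose service, assuming the entrypoint behaves as described above (the host path for the volume is just a placeholder; the mount is only there so the per-pipeline files are easy to inspect from the host):

services:
  logstash-competition:
    image: docker.elastic.co/logstash/logstash:8.8.2
    environment:
      # tells the docker-entrypoint script to use the file appenders instead of logging to stdout
      LOG_STYLE: file
    volumes:
      # optional: expose path.logs on the host so the pipeline log files are visible outside the container
      - ./logstash-logs:/usr/share/logstash/logs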