How to make Logstash write internal logs to a file? (--path.logs fails)

Context

I am building a program that will pass an Apache log file into Logstash and output the result (after parsing and filtering) to an external database (Elasticsearch, MongoDB, etc.).

Basically, the program will execute the following command:

gunzip -c -k "somefile.gz" | logstash -f "logstash.conf"

With logstash.conf containing:

input {
    stdin {}
}
filter {
    # ...
}
output {
    mongodb {
        # ...
    }
}
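
For context, the filter block will eventually do the Apache parsing; here is a minimal sketch of what I have in mind, assuming the standard combined log format and the built-in COMBINEDAPACHELOG grok pattern:

filter {
    grok {
        # parse one Apache combined-format line from the "message" field
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
        # use the request timestamp as the event @timestamp
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
}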

But, to help future debugging/forensics, I want to capture whatever Logstash says while it runs.

By default, Logstash seems to output its internal log messages to stdout. For example, the following command:

logstash -e "input { generator { count => 3 } } output { null {} }"

Outputs:

Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2018-03-28T22:42:53,484][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-03-28T22:42:53,515][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-03-28T22:42:54,222][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-28T22:42:55,092][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-03-28T22:42:55,795][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-28T22:42:58,225][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-03-28T22:42:58,407][INFO ][logstash.pipeline        ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x35d85de2 run>"}
[2018-03-28T22:42:58,559][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2018-03-28T22:42:58,867][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x35d85de2 run>"}

Adding --log.level=debug gives a lot more: the same output as the previous command, plus many additional lines, including the following per-message logs:

[2018-03-28T22:44:30,671][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2018-03-28T22:44:30.567Z, "sequence"=>1, "@version"=>"1", "message"=>"Hello world!", "host"=>"5e26a941934f"}}
[2018-03-28T22:44:30,679][DEBUG][logstash.pipeline        ] output received {"event"=>{"@timestamp"=>2018-03-28T22:44:30.567Z, "sequence"=>1, "@version"=>"1", "message"=>"Hello world!", "host"=>"5e26a941934f"}}
[2018-03-28T22:44:30,680][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2018-03-28T22:44:30.568Z, "sequence"=>2, "@version"=>"1", "message"=>"Hello world!", "host"=>"5e26a941934f"}}
...

Lots of noise, but it will be useful when tracking down an error.

Problem

I had trouble directly capturing the stdout and stderr of the logstash command in my program, and in any case I find it cleaner to ask Logstash to write its logs to a file directly (which I would then read from my program).
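
As a stopgap I could redirect both streams to a file myself, something along these lines (a rough sketch; the file name is arbitrary, and I am not sure whether Logstash writes its internal logs to stdout, stderr, or both, hence capturing both):

gunzip -c -k "somefile.gz" | logstash -f "logstash.conf" > logstash-internal.log 2>&1

But having Logstash manage its own log file still seems cleaner.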

According to the documentation, adding --path.logs=/tmp to the logstash command should make it write its log file to /tmp, but it does not. I get no log file in /tmp after the following command ends (nor while it executes):

logstash \
    --path.logs=/tmp \
    --log.level=debug \
    -e "input { generator { count => 3 } } output { null {} }"

I've also tried the configuration-file way, without much success:

  • added log.level: debug and path.logs: /tmp/logstash-logs to the /etc/logstash/logstash.yml file (see the snippet after this list),
  • created the /tmp/logstash-logs directory (with 777 permissions),
  • and executed logstash -e "input { generator { count => 3 } } output { null {} }"
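
For reference, the relevant part of my /etc/logstash/logstash.yml now looks like this (just the two settings above, written as plain YAML):

log.level: debug
path.logs: /tmp/logstash-logs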

The Logstash output confirms it took the changes into account (Sending Logstash's logs to /tmp/logstash-logs which is now configured via log4j2.properties), but there are still no log files in /tmp/logstash-logs.

So: how do I make Logstash write its log messages (not the data it processes) to a file?

I can't replicate with Logstash 6.2.2 or Logstash 5.6.8; both correctly send logs to a file in the given directory (although they both also output logs to stdout, which can be confusing).

What version of Logstash are you running, and how did you acquire it? (e.g., did you download a distribution from elastic.co, install with your system's package manager, or are you building/running from source?)

I originally tried the v6.2.3 Docker image, but here are the full updated steps (using the latest v6.2.4 Docker image) that still give me no log files:

Start the Docker container from a first shell:

docker run --rm -it docker.elastic.co/logstash/logstash-oss:6.2.4

Open a Bash shell in this container from another shell:

docker exec -ti $(docker ps | grep "logstash-oss:6.2.4" | cut -d ' ' -f 1 | head -n 1) /bin/bash

There, set up some paths and run Logstash:

mkdir -p "/tmp/logstash-test/data"
mkdir -p "/tmp/logstash-test/logs"
chmod -R 777 "/tmp/logstash-test"
logstash \
    --path.data=/tmp/logstash-test/data \
    --path.logs=/tmp/logstash-test/logs \
    --log.level=debug \
    -e "input { generator { count => 3 } } output { null {} }"

It gives the following output (abridged; the [...] lines mark elided output):

Sending Logstash's logs to /tmp/logstash-test/logs which is now configured via log4j2.properties
[2018-05-02T12:52:28,111][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[...]
[2018-05-02T12:52:28,309][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/tmp/logstash-test/data/queue"}
[2018-05-02T12:52:28,314][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/tmp/logstash-test/data/dead_letter_queue"}
[2018-05-02T12:52:28,958][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2018-05-02T12:52:28,958][DEBUG][logstash.runner          ] node.name: "11ac9116777b"
[2018-05-02T12:52:28,958][DEBUG][logstash.runner          ] *path.data: "/tmp/logstash-test/data" (default: "/usr/share/logstash/data")
[2018-05-02T12:52:28,959][DEBUG][logstash.runner          ] *config.string: "input { generator { count => 3 } } output { null {} }"
[2018-05-02T12:52:28,959][DEBUG][logstash.runner          ] modules.cli: []
[2018-05-02T12:52:28,959][DEBUG][logstash.runner          ] modules: []
[2018-05-02T12:52:28,959][DEBUG][logstash.runner          ] modules_setup: false
[2018-05-02T12:52:28,959][DEBUG][logstash.runner          ] config.test_and_exit: false
[2018-05-02T12:52:28,960][DEBUG][logstash.runner          ] config.reload.automatic: false
[2018-05-02T12:52:28,960][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2018-05-02T12:52:28,962][DEBUG][logstash.runner          ] config.support_escapes: false
[2018-05-02T12:52:28,962][DEBUG][logstash.runner          ] metric.collect: true
[2018-05-02T12:52:28,963][DEBUG][logstash.runner          ] pipeline.id: "main"
[2018-05-02T12:52:28,963][DEBUG][logstash.runner          ] pipeline.system: false
[2018-05-02T12:52:28,963][DEBUG][logstash.runner          ] pipeline.workers: 4
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.output.workers: 1
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.java_execution: false
[2018-05-02T12:52:28,964][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2018-05-02T12:52:28,967][DEBUG][logstash.runner          ] path.plugins: []
[2018-05-02T12:52:28,967][DEBUG][logstash.runner          ] config.debug: false
[2018-05-02T12:52:28,967][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2018-05-02T12:52:28,967][DEBUG][logstash.runner          ] version: false
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] help: false
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] log.format: "plain"
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] *http.host: "0.0.0.0" (default: "127.0.0.1")
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] http.port: 9600..9700
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] http.environment: "production"
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.type: "memory"
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.drain: false
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.max_events: 0
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2018-05-02T12:52:28,968][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] keystore.file: "/usr/share/logstash/config/logstash.keystore"
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] *path.queue: "/tmp/logstash-test/data/queue" (default: "/usr/share/logstash/data/queue")
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] *path.dead_letter_queue: "/tmp/logstash-test/data/dead_letter_queue" (default: "/usr/share/logstash/data/dead_letter_queue")
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] path.settings: "/usr/share/logstash/config"
[2018-05-02T12:52:28,969][DEBUG][logstash.runner          ] *path.logs: "/tmp/logstash-test/logs" (default: "/usr/share/logstash/logs")
[...]

And ls -l /tmp/logstash-test/* outputs:

/tmp/logstash-test/data:
total 12
drwxr-xr-x 2 logstash logstash 4096 May  2 12:52 dead_letter_queue
drwxr-xr-x 2 logstash logstash 4096 May  2 12:52 queue
-rw-r--r-- 1 logstash logstash   36 May  2 12:52 uuid

/tmp/logstash-test/logs:
total 0

Do I have to configure Log4j (https://www.elastic.co/guide/en/logstash/6.2/logging.html) for internal logging to work?
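
In other words, do I need to add a file appender myself? Here is a sketch of what I imagine such a config/log4j2.properties could look like; I am assuming path.logs and log.level are exposed to Log4j as ${sys:ls.logs} and ${sys:ls.log.level}, so please correct me if the stock file uses different property names:

status = error
name = LogstashConfig

# console appender: what currently ends up on stdout
appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

# rolling file appender writing under path.logs (assumed to be ${sys:ls.logs})
appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-plain.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-plain-%d{yyyy-MM-dd}.log
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = plain_console
rootLogger.appenderRef.rolling.ref = plain_rolling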

Within your docker container, are you logging to a location that has been exposed to your host OS? AFAIK, Docker containers run with their own /tmp that is not related to the host OS's folder of the same name.

No volume mounting is involved, and the ls commands I showed were run inside the Docker container.
