Logstash not showing output

Hey guys,

So I'm just trying to create a simple pipeline in Logstash that writes to an 'output' file based on data entered in an 'input' file.

The pipeline seems to start fine with no errors, but nothing is being written to the output file.

Is anyone able to give me some insight on this issue?

Thanks!


Use forward slash instead of backslash in the path option for the input.

ignore_older => 0 says to ignore any files more than zero seconds old. That means it ignores all files. Delete that line.
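Putting both suggestions together, a minimal input block might look like this (the path is taken from the trace output later in the thread; treat the exact locations as assumptions about your setup):

```
input {
  file {
    # Forward slashes, even on Windows
    path => "C:/LSData/Input/logstashi.log"
    # Read the file from the top when it is first discovered
    start_position => "beginning"
    # No ignore_older here -- the default watches files of any age
  }
}
```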

If those two changes do not help then enable --log.level trace and see what filewatch has to say.
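Assuming you start Logstash from its install directory and your config file is named pipeline.conf (both assumptions), that would look something like:

```
bin/logstash -f pipeline.conf --log.level trace
```

On Windows, use `bin\logstash.bat` instead of `bin/logstash`.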

I ran the trace, and Logstash appears to process the input file, but it seems to have trouble producing the output.

I've quoted the part of the log that seemed to be causing the issue just below, followed by the fuller trace. After doing some research, it seems like people were having success with earlier versions of Logstash. Could one of the updates have broken it?

[2019-04-01T11:42:34,844][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

[2019-04-01T11:42:33,523][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"output", :class=>LogStash::Outputs::File}
[2019-04-01T11:42:33,617][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"json_lines", :type=>"codec", :class=>LogStash::Codecs::JSONLines}
[2019-04-01T11:42:33,648][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@id = "json_lines_82f6f910-ee17-4bf9-a001-53352efeb3d3"
[2019-04-01T11:42:33,663][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@enable_metric = true
[2019-04-01T11:42:33,663][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@charset = "UTF-8"
[2019-04-01T11:42:33,663][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@delimiter = "\n"
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@path = "C:\\LSData\\Output\\logstasho.log"
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@id = "9ad1bd3faeb58276426561ff271d3f29c912391a0f8e3027011e12692b56cd32"
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@enable_metric = true
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@codec = <LogStash::Codecs::JSONLines id=>"json_lines_82f6f910-ee17-4bf9-a001-53352efeb3d3", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@workers = 1
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@flush_interval = 2
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@gzip = false
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@create_if_deleted = true
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@dir_mode = -1
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@file_mode = -1
[2019-04-01T11:42:33,679][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@write_behavior = "append"
[2019-04-01T11:42:33,773][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-04-01T11:42:34,570][TRACE][logstash.inputs.file     ] Registering file input {:path=>["C:/LSData/Input/logstashi.log"]}
[2019-04-01T11:42:34,648][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:\\Logstash\\data2/plugins/inputs/file/.sincedb_0863f232a16f84ddd5407498460a2364", :path=>["C:/LSData/Input/logstashi.log"]}
[2019-04-01T11:42:34,703][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x199ad410 run>"}
[2019-04-01T11:42:34,750][TRACE][logstash.agent           ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2019-04-01T11:42:34,828][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-01T11:42:34,844][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-04-01T11:42:34,922][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-01T11:42:34,953][DEBUG][logstash.agent           ] Starting puma
[2019-04-01T11:42:35,000][TRACE][filewatch.sincedbcollection] open: reading from C:\Logstash\data2/plugins/inputs/file/.sincedb_0863f232a16f84ddd5407498460a2364
[2019-04-01T11:42:35,031][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-04-01T11:42:34,984][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2019-04-01T11:42:35,063][TRACE][filewatch.discoverer     ] discover_files {"count"=>0}
[2019-04-01T11:42:35,188][DEBUG][logstash.api.service     ] [api-service] start
[2019-04-01T11:42:35,563][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-04-01T11:42:35,859][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-04-01T11:42:35,875][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-04-01T11:42:35,875][DEBUG][logstash.outputs.file    ] Starting flush cycle
[2019-04-01T11:42:37,891][DEBUG][logstash.outputs.file    ] Starting flush cycle
[2019-04-01T11:42:39,740][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x199ad410 sleep>"}

There is nothing in the sincedb (count of keys read: 0), and it is not finding the input file (discover_files {"count"=>0}). Are you sure that file exists?
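As an aside, while testing it often helps to disable the sincedb so the file is re-read from the start on every run; otherwise Logstash remembers how far it read last time. On Windows that is done by pointing sincedb_path at the null device (a sketch, with the same assumed input path as before; on Linux you would use /dev/null):

```
input {
  file {
    path => "C:/LSData/Input/logstashi.log"
    start_position => "beginning"
    # NUL is the Windows null device, so read positions are never persisted
    sincedb_path => "NUL"
  }
}
```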

It works now!! So the culprit was the path.data setting in my logstash.yml file... my path had backslashes instead of forward slashes... I can't believe it haha.
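For anyone hitting the same thing, the fix lives in logstash.yml. A sketch (the exact directory here is an assumption based on the sincedb path in the trace above):

```
# logstash.yml
# Use forward slashes even on Windows
path.data: C:/Logstash/data2
```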

Thank you so much for your help!


So it ended up working that one time when I modified the path in the logstash.yml (the output was updated automatically), but it did not continue to work afterwards.

[2019-04-01T13:11:29,912][INFO ][logstash.outputs.file ] Opening file {:path=>"C:/LSData/Output/logstasho.log"}

New entries in the input file did not produce new entries in the output file, and restarting Logstash did not help either... weird.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.