Logstash skips file while reading config

Logstash just starts and shuts down due to a configpathloader error. Please explain what the issue is. Below is the error:

[DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/home/mehak/Documents/logstash-7.4.0/CONTRIBUTORS", "/home/mehak/Documents/logstash-7.4.0/Gemfile", "/home/mehak/Documents/logstash-7.4.0/Gemfile.lock", "/home/mehak/Documents/logstash-7.4.0/LICENSE.txt", "/home/mehak/Documents/logstash-7.4.0/NOTICE.TXT", "/home/mehak/Documents/logstash-7.4.0/bin", "/home/mehak/Documents/logstash-7.4.0/config", "/home/mehak/Documents/logstash-7.4.0/data", "/home/mehak/Documents/logstash-7.4.0/lib", "/home/mehak/Documents/logstash-7.4.0/logs", "/home/mehak/Documents/logstash-7.4.0/logstash-core", "/home/mehak/Documents/logstash-7.4.0/logstash-core-plugin-api", "/home/mehak/Documents/logstash-7.4.0/modules", "/home/mehak/Documents/logstash-7.4.0/tools", "/home/mehak/Documents/logstash-7.4.0/vendor", "/home/mehak/Documents/logstash-7.4.0/x-pack"]}
[2019-10-29T13:14:14,614][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/mehak/Documents/logstash-7.4.0/logstash.conf"}
[2019-10-29T13:14:14,791][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2019-10-29T13:14:14,817][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2019-10-29T13:14:15,125][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 2, column 1 (byte 2) after \n", :backtrace=>["/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2584:in map'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:153:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/java_pipeline.rb:26:in initialize'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/home/mehak/Documents/logstash-7.4.0/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}

Below is the logstash.config file:

filebeat.inputs:
- type: log
  paths:
    - /var/log/system.log
    - /var/log/wifi.log
- type: log
  paths:
    - "/var/log/apache2/*"
  fields:
    apache: true
  fields_under_root: true

output.elasticsearch:
  hosts: ["https://localhost:9200"]
  index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"
  ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
  ssl.certificate: "/etc/pki/client/cert.pem"
  ssl.key: "/etc/pki/client/cert.key"

I moved your post to Logstash, as it is a Logstash-related question.

If I understand this right, you are trying to use a Beats config in Logstash. Logstash uses a different format for its config files, something along these lines:

input{
}
filter{
}
output{
}

See here: https://www.elastic.co/guide/en/logstash/current/configuration.html
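For context, a complete minimal pipeline along those lines, listening for Beats input and forwarding to Elasticsearch, could look like this (the port and host values are assumptions to adapt to your setup):

```conf
# Minimal Logstash pipeline sketch; port 5044 and localhost:9200 are assumptions
input {
  beats {
    port => 5044        # Filebeat ships here via its output.logstash setting
  }
}

filter {
  # Filters are optional; with an empty block, events pass through unchanged
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```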

Hi @st3inbeiss, I fixed the logstash.conf file and it looks like this now:

# Read input from Filebeat by listening on port 5044, to which Filebeat will send the data
input {
  beats {
    type => "test"
    port => "5044"
  }
}

filter {
  # If a log line contains a tab character followed by 'at', tag that entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }

  # Send properly parsed log events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

But I still get the "skipping files" message:
[2019-10-30T10:44:43,602][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-10-30T10:44:43,609][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-10-30T10:44:43,687][DEBUG][logstash.agent ] Setting up metric collection
[2019-10-30T10:44:43,812][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-10-30T10:44:44,273][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-10-30T10:44:44,804][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-10-30T10:44:44,817][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-10-30T10:44:44,861][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-10-30T10:44:44,887][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2019-10-30T10:44:45,014][DEBUG][logstash.agent ] Starting agent
[2019-10-30T10:44:45,298][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>

This is a non-issue. It is a debug message that is routinely logged.

This debug message is followed by an error-
[2019-10-30T11:17:48,727][ERROR][logstash.config.sourceloader] No configuration found in the configured sources

This is my pipelines.yml file content. I created two files called first-pipeline.conf and second-pipeline.conf in the logstash folder, each containing just input{} filter{} output{}, to pass something in.

- pipeline.id: test
  pipeline.workers: 1
  path.config: "/tmp/logstash/first-pipeline.config"
  pipeline.batch.size: 1

- pipeline.id: another_test
  queue.type: persisted
  path.config: "/tmp/logstash/second-pipeline.config"
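One thing worth double-checking here: each path.config must exactly match a file on disk, including the extension. The paths above end in .config, while the files created were named first-pipeline.conf and second-pipeline.conf, which would explain Logstash finding no configuration in the configured sources. A corrected sketch, assuming the .conf files live in /tmp/logstash:

```yaml
# pipelines.yml sketch; paths are assumptions matching the .conf filenames mentioned above
- pipeline.id: test
  pipeline.workers: 1
  path.config: "/tmp/logstash/first-pipeline.conf"
  pipeline.batch.size: 1

- pipeline.id: another_test
  queue.type: persisted
  path.config: "/tmp/logstash/second-pipeline.conf"
```

Also note that pipelines.yml is only consulted when Logstash is started without -f or -e; the earlier WARN line ("Ignoring the 'pipelines.yml' file because modules or command line options are specified") shows it being skipped.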

Made progress by creating a pipeline.conf file in the logstash folder and running the command bin/logstash -f pipeline.conf

After the "Skipping the following files" DEBUG statement, I don't get any errors, but Logstash still runs for only a few seconds. What is the issue here? Where would I give the location of the logs I want passed in?

[2019-10-30T11:35:15,339][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/mehak/Documents/logstash-7.4.0/pipeline.conf"}
[2019-10-30T11:35:15,423][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2019-10-30T11:35:15,466][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2019-10-30T11:35:15,799][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2019-10-30T11:35:16,142][DEBUG][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main"}
[2019-10-30T11:35:16,191][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-10-30T11:35:16,194][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x26aff7df run>"}
[2019-10-30T11:35:16,240][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-10-30T11:35:16,268][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x26aff7df run>"}
[2019-10-30T11:35:16,345][DEBUG][logstash.javapipeline ][main] Input plugins stopped! Will shutdown filter/output workers. {:pipeline_id=>"main", :thread=>"#<Thread:0x26aff7df run>"}
[2019-10-30T11:35:16,346][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2019-10-30T11:35:16,385][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x3b090fe2 run>"}
[2019-10-30T11:35:16,425][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x7664fdfb run>"}
[2019-10-30T11:35:16,432][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[2019-10-30T11:35:16,438][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x1710ef6 dead>"}
[2019-10-30T11:35:16,438][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x7dac9bdd dead>"}
[2019-10-30T11:35:16,448][DEBUG][logstash.javapipeline ][main] Pipeline has been shutdown {:pipeline_id=>"main", :thread=>"#<Thread:0x26aff7df run>"}
[2019-10-30T11:35:16,491][DEBUG][logstash.agent ] Starting puma
[2019-10-30T11:35:16,510][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-10-30T11:35:16,522][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2019-10-30T11:35:16,555][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2019-10-30T11:35:16,558][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2019-10-30T11:35:16,560][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2019-10-30T11:35:16,578][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
[2019-10-30T11:35:16,579][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>0}
[2019-10-30T11:35:16,586][DEBUG][logstash.api.service ] [api-service] start
[2019-10-30T11:35:16,808][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-10-30T11:35:21,814][INFO ][logstash.runner ] Logstash shut down

This helped, thanks!

What does your pipeline configuration look like?

Logstash is creating the index in Elasticsearch, but no data is sent. So my question is: how do I check that the pipeline is created and that data is being sent from Filebeat to the Logstash port?

Index details:
yellow open logstash-2019.10.30-000001 WnnV21usSDqM18yvaVowZQ 1 1 0 0 283b 283b

No data document in index:
{
  "count" : 0,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  }
}

Here is the pipeline.conf file (note that Logstash config comments use #, not //):

# Read input from Filebeat by listening on port 5044, to which Filebeat will send the data
input {
  beats {
    type => "another_test"
    port => "5044"
  }
}
filter {
  # If a log line contains a tab character followed by 'at', tag that entry as a stacktrace
  # if [message] =~ "\tat" {
  #   grok {
  #     match => ["message", "^(\tat)"]
  #     add_tag => ["stacktrace"]
  #   }
  # }
}
output {
  # stdout {
  #   codec => rubydebug
  # }

  # Send properly parsed log events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
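The Logstash side above listens on port 5044, but events only arrive there if Filebeat is configured to ship to Logstash rather than directly to Elasticsearch (the filebeat.yml earlier in this thread uses output.elasticsearch). A sketch of the relevant Filebeat output section, assuming Logstash runs on localhost:

```yaml
# filebeat.yml output sketch; host and port are assumptions.
# Only one output section may be enabled at a time, so output.elasticsearch
# would need to be commented out.
output.logstash:
  hosts: ["localhost:5044"]
```

Once Filebeat ships to this port, the document count of the index should start rising.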

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.