Logstash ignoring a particular pipeline defined in pipelines.yml: No configuration found in the configured sources

I am a new user of the ELK Stack and my main purpose is visualizing data from SQL tables. The computer I use the ELK Stack on runs 64-bit Windows 10 Enterprise Edition. The installation consists of Elasticsearch 6.2.2, Logstash 6.2.2, and Kibana 6.2.2, alongside the JDBC 6.0 driver for Windows.

I have edited the pipelines.yml file so that Logstash stores data in Elasticsearch. For experimental purposes I have two distinct pipelines, namely Train and YKB. Apart from the commented lines, my YAML configuration file consists of four lines. This file in turn refers to two other configuration files (sql-pipeline_Training.conf and sql-pipeline_YKBÜ.conf) which contain the actual pipeline definitions. The Train pipeline is ignored completely, and when it is the only pipeline defined in pipelines.yml, Logstash reports the error "No configuration found in the configured sources."

The sql-pipeline_Training.conf pipeline works fine by itself when Logstash is run with the '-f' flag in PowerShell. However, no matter what I do, I cannot get it to work alongside the other pipeline (sql-pipeline_YKBÜ.conf) when Logstash is called without any arguments or flags. Even having it as the single pipeline defined in the pipelines.yml file did not remedy the situation. I am perplexed by this, and the single file I had to fall back to is getting pretty cluttered. I tried adding the most basic pipeline, which simply prints stdin to stdout, and it worked well alongside the YKBÜ SQL pipeline.
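For reference, the single-pipeline run is simply bin\logstash.bat -f sql-pipeline_Training.conf from the Logstash directory, and the basic stdin-to-stdout test pipeline is nothing more elaborate than the following sketch (the rubydebug codec is just what I use for readable output, not anything specific to my setup):

input {
  # read events typed into the console
  stdin { }
}
output {
  # print each event back to the console
  stdout { codec => rubydebug }
}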

Could the problem be caused by running two JDBC instances at the same time? However, deleting the other SQL pipelines did not remedy the situation, and this pipeline (Training) is not included in the count. I would appreciate your suggestions and help in solving this.
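For context, each SQL pipeline is built around the jdbc input plugin. Stripped of the real connection details, the files have roughly the following shape; the driver path, connection string, credentials, query, and index name below are placeholders rather than my actual values:

input {
  jdbc {
    # placeholder connection settings for a SQL Server source
    jdbc_driver_library => "C:/path/to/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM my_table"
  }
}
output {
  elasticsearch {
    # placeholder Elasticsearch destination
    hosts => ["localhost:9200"]
    index => "training"
  }
}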

Thanks in advance

P.S.: I have attached the pipelines.yml file and the corresponding SQL configuration files to this post.

pipelines.yml
sql-pipeline_Training.conf
sql-pipeline_YKBÜ.conf
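In case the attachments do not display inline, the non-commented part of my pipelines.yml is reconstructed below, roughly as it was originally written, i.e. with the drive letter included:

-  pipeline.id: Train
   path.config: "C:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/sql-pipeline_Training.conf"
-  pipeline.id: YKB
   path.config: "C:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/sql-pipeline_YKBÜ.conf"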

Hello, the problem still persists with my installation. Does anyone else experience similar issues? The error message changes when I comment out or delete the entire pipelines.yml file; in that case it says: Pipelines YAML file is empty.

After some more experimentation I think the problem might be related to the following message, displayed when debug logging is enabled. I have enclosed a snippet of the debug output from the PowerShell window below; it seems that the files corresponding to the non-working pipelines are among those being skipped.

[2018-03-24T20:11:49,468][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/first-pipeline.conf", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/jvm.options", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/log4j2.properties", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/logstash.yml", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/pipelines.yml", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/pipelines_backup.yml", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/sql-pipeline.conf", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/sql-pipeline_train.conf", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/startup.options", "c:/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/config/training.conf"]}

I do not know why they are skipped. How can I stop them from being ignored? Note that the file corresponding to the working pipeline (sql-pipeline_YKBÜ.conf) is not in the list.

In your pipelines.yml file, remove the drive letters but leave the leading forward slash:

-  pipeline.id: Train
   path.config: "/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/sql-pipeline_Training.conf"
-  pipeline.id: YKB
   path.config: "/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/sql-pipeline_YKBÜ.conf"
#-  pipeline.id: SQL
#   path.config: "/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/sql-pipeline.conf"

Dear @wwalker, thanks for the reply; this works, and so does using relative paths (sketched below). I even tried concatenating the matched-files array with the unmatched-files array in the ./logstash-core/lib/logstash/config/source/local.rb file. But there can be two or more drives on a Windows computer, so how is the drive letter information conveyed then?
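For completeness, the relative-path form that also works for me looks something like this; in my case the paths resolve against the Logstash directory I start Logstash from, which is where the .conf files live:

-  pipeline.id: Train
   path.config: "sql-pipeline_Training.conf"
-  pipeline.id: YKB
   path.config: "sql-pipeline_YKBÜ.conf"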

I have no clue. I have a production setup we are half-assedly using at work where the Elastic Stack is installed on the D: drive, and not including the drive letter still works. I'm guessing that it either uses the drive letter of wherever the application resides, or it just looks for any paths that match across any volumes on the system. Easy to test... it just hasn't been something I needed to test, lol.

BTW, don't forget to hit the checkmark on the post that is the answer; it helps others out who are looking for the same information.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.