Multiple pipelines: Failed to read pipelines yaml file


I've managed to run Elasticsearch with Kibana, and Logstash with the JDBC MSSQL driver for ingesting data.

Now I'm trying to run multiple pipelines, but I keep getting an error.

ELK version: 7.4.0

pipelines.yml:

# List of pipelines to be loaded by Logstash
# This document must be a list of dictionaries/hashes, where the keys/values are pipeline settings.
# Default values for omitted settings are read from the `logstash.yml` file.
# When declaring multiple pipelines, each MUST have its own `pipeline.id`.
# Example of two pipelines:
 - relation
   #pipeline.workers: 1
   path.config: "\Elastic_Stack\logstash-7.4.0\config\conf_relation.conf"
#   pipeline.batch.size: 1
#   config.string: "input { generator {} } filter { sleep { time => 1 } } output { stdout { codec => dots } }"
 - connectionpoint
#   queue.type: persisted
path.config: "\Elastic_Stack\logstash-7.4.0\config\conf_connectionpoint.conf"

Each separate conf file runs fine with bin/logstash -f conf_relation.conf or bin/logstash -f conf_connectionpoint.conf

location of pipelines.yml: "C:\Elastic_Stack\logstash-7.4.0\config\pipelines.yml"

When running Logstash I get this error:

Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/jars/jruby-complete- to field
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/Elastic_Stack/logstash-7.4.0/logs which is now configured via
ERROR: Failed to read pipelines yaml file. Location: C:/Elastic_Stack/logstash-7.4.0/config/pipelines.yml
  bin/logstash -f CONFIG_PATH [-t] [-r] [] [-w COUNT] [-l LOG]
  bin/logstash -e CONFIG_STR [-t] [--log.level fatal|error|warn|info|debug|trace] [-w COUNT] [-l LOG]
  bin/logstash -i SHELL [--log.level fatal|error|warn|info|debug|trace]
  bin/logstash -V [--log.level fatal|error|warn|info|debug|trace]
  bin/logstash --help
[2019-12-04T19:45:42,649][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Any help please!



I have almost the same problem, but on ELK 7.5.0.
What does your conf_relation.conf look like?

my conf file looks like this:

    jdbc {
        jdbc_connection_string => "jdbc:sqlserver://HostName\instanceName;database=DBName;user=UserName;password=Password"
        # The path to your downloaded JDBC driver
        jdbc_driver_library => "C:\Elastic_Stack\logstash-7.4.2\logstash-core\lib\jars\sqljdbc42.jar"
        # The name of the driver class for SQL Server
        jdbc_driver_class => ""
        columns_charset => { "version" => "ISO-8859-1" } # ISO-8859-1, UTF-8
        # Query for testing purposes
        statement => "SELECT id, lang, name, legalform, vatnumber, street, streetnumber, reference, nacecode2008, version_calc from rel where version_calc > :sql_last_value"
        use_column_value => true
        tracking_column => "version_calc"
        #tracking_column_type => "numeric"
        schedule => "/30 * * * * *" # 30 sec = /30 * * * * *

        #clean_run => true

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "relation_test"
            #document_id => "datecreated"
        stdout { codec => rubydebug }
    - relation
      pipeline.workers: 1
      path.config: "\Elastic_Stack\logstash-7.4.0\config\conf_relation.conf"

Just run your pipelines.yml with this configuration and see if you get it working.

Your current pipelines.yml isn't specifying any workers for either of the pipelines, so I think, as far as Logstash is concerned, you've asked it to perform a task but not given it any resources to complete it.

I have something similar that works:

    - audit
      pipeline.workers: 1
      path.config: "/etc/logstash/conf.d/live/client_A45.conf"
    - nginx
      pipeline.workers: 3
      path.config: "/etc/logstash/conf.d/live/client_A67.conf"
    - security
      pipeline.workers: 1
      path.config: "/etc/logstash/conf.d/live/client_B45.conf"
    - file_changes
      pipeline.workers: 2
      path.config: "/etc/logstash/conf.d/live/client_B23.conf"

Just remember: workers = CPU, so you want the workers to match the expected workload. A few light logs per worker are fine, but heavy logs take longer to parse and need more CPU to parse faster. Using sincedb to keep timestamped access to log files in check is a smart way to control log read times. Also, put an interval in each .conf file and try to stagger them: for example, B45.conf and B23.conf access logs on the same client infrastructure, but one checks for changes every 23 minutes and the other every 45 minutes. Inside the .conf files we adjust, change, and massage things until the logs coming into Elastic are parsed the way we want.
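The staggering idea can be sketched in the jdbc input's `schedule` option (hypothetical filenames, query, and intervals; note the five-field cron form here counts minutes, while the six-field form in the OP's conf starts with seconds):

```
# client_B23.conf: polls every 23 minutes (hypothetical interval)
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://HostName;database=DBName"
    jdbc_driver_library => "C:\path\to\sqljdbc42.jar"   # placeholder path
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM rel WHERE version_calc > :sql_last_value"
    use_column_value => true
    tracking_column => "version_calc"
    schedule => "*/23 * * * *"   # five fields: every 23 minutes
  }
}

# client_B45.conf would be identical apart from its query and
# schedule => "*/45 * * * *", so the two pollers rarely fire together
```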

I've tried your reply but I'm still getting an error.

pipelines.yml now contains only this:

- relation
  pipeline.workers : 1
  path.config : "\Elastic_Stack\logstash-7.4.0\config\conf_relation.conf"

- connectionpoint
  pipeline.workers : 1
  path.config : "\Elastic_Stack\logstash-7.4.0\config\conf_connectionpoint.conf"

The error I'm getting is:

 [2019-12-05T12:42:59,374][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 1, column 1 (byte 1)", :backtrace=>["C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/ `map'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/ `initialize'", "org/logstash/execution/ `initialize'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/java_pipeline.rb:26:in `initialize'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "C:/Elastic_Stack/logstash-7.4.0/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}

I don't see where I'm wrong. Do your .conf files need to be in a specific folder, or is my yml still wrong?

I tried removing the c:\ as some people refer to it as a problem,
and changed the indentation in the yml.

please help :slight_smile:

Your conf... is that the full file or did you cut out the top part?
You've got no input statement.
Something like:

    input {
      jdbc {
        jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
        jdbc_user => "mysql"
        parameters => { "favorite_artist" => "Beethoven" }
        schedule => "* * * * *"
        statement => "SELECT * from songs where artist = :favorite_artist"
      }
    }

Yes, I cut out the top part. But separately both files work with Logstash,
when running bin/logstash -f conf_relation.conf or bin/logstash -f conf_connectionpoint.conf.

It's when I try to run them with pipelines.yml that it fails.
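That difference matters, by the way: as far as I know, passing -f (or -e) makes Logstash bypass pipelines.yml entirely, so a conf working under -f only proves the conf file itself is valid, not the pipelines.yml:

```
# single-pipeline test: loads ONLY this conf, pipelines.yml is ignored
bin/logstash -f conf_relation.conf

# multi-pipeline mode: start with no -f/-e so Logstash reads config/pipelines.yml
bin/logstash
```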

Are you on Windows or Linux?

Windows 10

I'm on Linux so our pathing structure is different, but I think you have to double up the separators in the path for Windows, or it's \ instead of /.


Did you already try this?

path.config : "c:\\Elastic_Stack\\logstash-7.4.0\\config\\conf_connectionpoint.conf"

That is how I have it configured on Windows 10 (so with double backslashes).
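Putting the thread together, a minimal pipelines.yml for this setup on Windows might look like the sketch below (paths taken from this thread; note that per the Logstash docs each entry also needs its own `pipeline.id` key, and backslashes inside double-quoted YAML strings must be doubled):

```yaml
# config/pipelines.yml
- pipeline.id: relation
  pipeline.workers: 1
  path.config: "c:\\Elastic_Stack\\logstash-7.4.0\\config\\conf_relation.conf"

- pipeline.id: connectionpoint
  pipeline.workers: 1
  path.config: "c:\\Elastic_Stack\\logstash-7.4.0\\config\\conf_connectionpoint.conf"
```

Alternatively, forward slashes ("c:/Elastic_Stack/...") also work on Windows and sidestep the escaping issue entirely.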