Pipeline failed to load. Error: Expected one of [ \t\r\n], "#", "input", "filter", "output" at line 21

Running a pipeline with two config files.
Error after running: logstash -f .\config\pipelines.yml

Error:
[ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], \"#\", \"input\", \"filter\", \"output\" at line 21, column 1 (byte 666) after "

Config file:

# address_pipeline.conf

input {
  jdbc {
    clean_run => true
    jdbc_driver_library => "D:\ELK_tools\logstash-conf_jdbc\ojdbc11.jar"
    jdbc_driver_class => "oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@localhost:1521:orcl"
    jdbc_user => "system"
    jdbc_password => "XXX"
    schedule => "*/5 * * * * *"
    statement => "SELECT * FROM .table1 WHERE date> :sql_last_value AND date< CURRENT_TIMESTAMP ORDER BY date ASC"
    use_column_value => true
    tracking_column => "date"
    tracking_column_type => "timestamp"
    record_last_run => true
  }
}

filter {
  date { match => ["XXX", "ISO8601"] }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "XXX"
    user => "elastic"
    password => "XXXX"
    ssl => false
    ssl_certificate_verification => XXX
    codec => plain { charset => "UTF-8" }
  }
}
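As an aside, when two jdbc pipelines run side by side, each input should persist its sql_last_value tracking state in its own file so the two schedules do not interfere. A minimal sketch (the path is hypothetical):

```
input {
  jdbc {
    # ... connection and statement settings as above ...
    record_last_run => true
    # Give each pipeline its own state file for sql_last_value
    last_run_metadata_path => "D:/ELK_tools/logstash-conf_jdbc/.address_last_run"
  }
}
```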

This is not correct. If you want to use pipelines.yml with your pipelines, you do not pass it with the -f argument; just run Logstash without any -f parameter and it will pick up the pipelines configured in pipelines.yml.
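For reference, a minimal pipelines.yml with one entry per pipeline might look like this (the pipeline ids and paths here are examples, not taken from this thread):

```yaml
# config/pipelines.yml
- pipeline.id: address
  path.config: "D:/ELK_tools/pipelines/address_pipeline.conf"
- pipeline.id: delivery
  path.config: "D:/ELK_tools/pipelines/delivery_pipeline.conf"
```

With this file in place, each .conf file runs as its own isolated pipeline, so events from one input never reach the other pipeline's output.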

Tried to execute without any file parameter:
D:\ELK_tools\logstash-8.8.2-windows-x86_64\logstash-8.8.2\bin>logstash

After executing the above command, I'm getting this error:
[ERROR][logstash.config.sourceloader] No configuration found in the configured sources.

My pipelines.yml is in the config folder:
D:\ELK_tools\logstash-8.8.2-windows-x86_64\logstash-8.8.2\config

I'm a newbie to this! Thanks for the suggestion.

I was able to run the pipelines by using:

"bin> logstash --path.settings /D:\ELK_tools\logstash-8.8.2-windows-x86_64\logstash-8.8.2\config --path.config=/D:\ELK_tools\logstash-8.8.2-windows-x86_64\logstash-8.8.2\pipeline"

Problem: when I check the mapped indexes from Kibana, the result set is a combination of both query runs (example: the 1st config file pulls 100 rows and the 2nd pulls 222 rows, yet each mapped index shows 322). There are common field names in both queries.

1st config file:

output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "delivery_trial"
    user => "elastic"
    password => "XXX"
    ssl => false
    ssl_certificate_verification => false
    codec => plain { charset => "UTF-8" }
  }
}

2nd Config file:

output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "address_trial"
    user => "elastic"
    password => "XXXX"
    ssl => false
    ssl_certificate_verification => false
    codec => plain { charset => "UTF-8" }
  }
}
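This behavior is expected when --path.config points to a directory: Logstash concatenates all .conf files there into a single pipeline, so events from both jdbc inputs flow through both elasticsearch outputs, which is why the counts combine. One common workaround, if you cannot use pipelines.yml, is to tag events per input and route them conditionally in the output. A sketch (the tag name is an assumption, not from this thread):

```
# address_pipeline.conf
input {
  jdbc {
    # ... jdbc settings as in the original config ...
    tags => ["address"]
  }
}

output {
  # Only events produced by the tagged input reach this output
  if "address" in [tags] {
    elasticsearch {
      hosts => ["http://localhost:9200/"]
      index => "address_trial"
    }
  }
}
```

The delivery config would tag its input "delivery" and guard its output the same way, keeping the two result sets in separate indexes.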

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.