Logstash Sourceloader - No configuration found in the configured sources

I have a config file that was running fine until this morning. Its contents are:

```
input {
  file {
    path => "C:/Users/data/test.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "food", "food_desc" ]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "food"
    document_type => "description"
  }
  stdout { codec => rubydebug }
}
```

Running `$ ./bin/logstash --debug -f C:/Users/Documents/data/logstash_test.config` produces the debug log posted below.

I tried uninstalling and reinstalling Logstash, but to no avail. Any insights are appreciated!

**Error Log**

Sending Logstash's logs to C:/Users/Documents/logstash/logstash-6.3.0/logs which is now configured via log4j2.properties
[2018-07-03T11:01:25,932][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"C:/Users/Documents/logstash/logstash-6.3.0/modules/fb_apache/configuration"}
[2018-07-03T11:01:25,938][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x7e81f6d0 @directory="C:/Users/Documents/logstash/logstash-6.3.0/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2018-07-03T11:01:25,939][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"C:/Users/Documents/logstash/logstash-6.3.0/modules/netflow/configuration"}
[2018-07-03T11:01:25,940][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x288b8b5b @directory="C:/Users/Documents/logstash/logstash-6.3.0/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2018-07-03T11:01:26,054][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2018-07-03T11:01:26,055][DEBUG][logstash.runner          ] node.name: "test"
[2018-07-03T11:01:26,055][DEBUG][logstash.runner          ] *path.config: "C:UsersDocumentsdatalogstash_test.config"
[2018-07-03T11:01:26,055][DEBUG][logstash.runner          ] path.data: "C:/Users/Documents/logstash/logstash-6.3.0/data"
[2018-07-03T11:01:26,055][DEBUG][logstash.runner          ] modules.cli: []
[2018-07-03T11:01:26,055][DEBUG][logstash.runner          ] modules: []
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] modules_setup: false
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] config.test_and_exit: false
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] config.reload.automatic: false
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] config.support_escapes: false
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] metric.collect: true
[2018-07-03T11:01:26,056][DEBUG][logstash.runner          ] pipeline.id: "main"
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.system: false
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.workers: 8
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.output.workers: 1
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2018-07-03T11:01:26,057][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2018-07-03T11:01:26,058][DEBUG][logstash.runner          ] pipeline.java_execution: false
[2018-07-03T11:01:26,058][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2018-07-03T11:01:26,058][DEBUG][logstash.runner          ] path.plugins: []
[2018-07-03T11:01:26,058][DEBUG][logstash.runner          ] config.debug: false
[2018-07-03T11:01:26,058][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")

**Error Log (cont.)**

[2018-07-03T11:01:26,058][DEBUG][logstash.runner ] version: false
[2018-07-03T11:01:26,058][DEBUG][logstash.runner ] help: false
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] log.format: "plain"
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] http.port: 9600..9700
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] http.environment: "production"
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] queue.type: "memory"
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] queue.drain: false
[2018-07-03T11:01:26,059][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] queue.max_events: 0
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2018-07-03T11:01:26,060][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] slowlog.threshold.warn: -1
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] slowlog.threshold.info: -1
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] slowlog.threshold.debug: -1
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] slowlog.threshold.trace: -1
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] keystore.file: "C:/Users/Documents/logstash/logstash-6.3.0/config/logstash.keystore"
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] path.queue: "C:/Users/Documents/logstash/logstash-6.3.0/data/queue"
[2018-07-03T11:01:26,061][DEBUG][logstash.runner ] path.dead_letter_queue: "C:/Users/Documents/logstash/logstash-6.3.0/data/dead_letter_queue"
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] path.settings: "C:/Users/Documents/logstash/logstash-6.3.0/config"
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] path.logs: "C:/Users/Documents/logstash/logstash-6.3.0/logs"
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.enabled: false
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.logstash.poll_interval: 5000000000
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.pipeline.id: ["main"]
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.elasticsearch.username: "logstash_system"
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.elasticsearch.url: ["https://localhost:9200"]
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.management.elasticsearch.sniffing: false
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.monitoring.enabled: false
[2018-07-03T11:01:26,062][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.url: ["http://localhost:9200"]
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.collection.interval: 10000000000
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.collection.timeout_interval: 600000000000
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.sniffing: false
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.collection.pipeline.details.enabled: true
[2018-07-03T11:01:26,063][DEBUG][logstash.runner ] xpack.monitoring.collection.config.enabled: true
[2018-07-03T11:01:26,064][DEBUG][logstash.runner ] node.uuid: ""
[2018-07-03T11:01:26,064][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2018-07-03T11:01:26,110][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-03T11:01:26,167][DEBUG][logstash.agent ] Setting up metric collection
[2018-07-03T11:01:26,243][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-03T11:01:26,393][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-03T11:01:26,517][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T11:01:26,523][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T11:01:26,535][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-03T11:01:26,544][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-07-03T11:01:26,592][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2018-07-03T11:01:26,606][DEBUG][logstash.agent ] Starting agent
[2018-07-03T11:01:26,660][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2018-07-03T11:01:26,709][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>0}
[2018-07-03T11:01:26,752][DEBUG][logstash.agent ] Starting puma
[2018-07-03T11:01:26,761][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2018-07-03T11:01:26,764][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2018-07-03T11:01:26,787][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2018-07-03T11:01:26,789][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2018-07-03T11:01:26,789][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2018-07-03T11:01:26,816][DEBUG][logstash.api.service ] [api-service] start
[2018-07-03T11:01:27,039][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-03T11:01:32,021][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
[2018-07-03T11:01:32,022][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>0}

> [2018-07-03T11:01:26,055][DEBUG][logstash.runner ] *path.config: "C:UsersDocumentsdatalogstash_test.config"
> [2018-07-03T11:01:26,055][DEBUG][logstash.runner ] path.data: "C:/Users/Documents/logstash/logstash-6.3.0/data"

If you select that text and click on the </> button in the toolbar above the compose window this would be a lot easier to read.

Sorry about that; I've updated the first part. The forum shows an error whenever I try to update the second portion, but I'll keep trying.

It's lost all the / characters. I don't use Logstash on Windows, so I can't say for sure what would cause this. Does it help if you put quotes ("") around the path?
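For instance, quoting the `-f` argument (using the path from this thread) is one way to keep the shell from eating the separators. This is a sketch of the invocation, not a verified fix:

```shell
# Quote the -f path (and keep forward slashes) so the Windows shell
# does not strip the path separators; the path is the one from this thread.
bin/logstash --debug -f "C:/Users/Documents/data/logstash_test.config"
```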

The error isn't showing up anymore, though all it prints after creating the pipeline is:

[2018-07-03T12:14:52,888][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-07-03T12:14:52,898][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-07-03T12:14:52,911][DEBUG][logstash.filters.csv     ] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2018-07-03T12:14:53,564][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 run>"}
[2018-07-03T12:14:53,664][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-07-03T12:14:53,694][DEBUG][logstash.agent           ] Starting puma
[2018-07-03T12:14:53,707][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2018-07-03T12:14:53,743][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/data/test.csv: glob is: ["C:/Users/data/test.csv"]
[2018-07-03T12:14:53,745][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:14:53,747][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:14:53,775][DEBUG][logstash.api.service     ] [api-service] start
[2018-07-03T12:14:53,975][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-03T12:14:58,581][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:14:58,754][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:14:58,755][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:03,583][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:03,759][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:03,760][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:07,802][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/data/test.csv: glob is: ["C:/Users/data/test.csv"]
[2018-07-03T12:15:08,584][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:08,767][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:08,768][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:13,585][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:13,772][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:13,773][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:18,588][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:18,777][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:18,777][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:22,816][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/data/test.csv: glob is: ["C:/Users/data/test.csv"]
[2018-07-03T12:15:23,588][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:23,782][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:23,782][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:28,588][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:28,786][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:28,786][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:33,591][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x5c9cfa91 sleep>"}
[2018-07-03T12:15:33,790][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-07-03T12:15:33,791][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-07-03T12:15:37,827][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/data/test.csv: glob is: ["C:/Users/data/test.csv"]

So it appears to be running OK. One note: on Windows, `sincedb_path` should be `NUL`, not `/dev/null`.
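There is no `/dev/null` on Windows; a minimal sketch of the same input block using the Windows null device instead:

```
input {
  file {
    path => "C:/Users/data/test.csv"
    start_position => "beginning"
    # NUL is the Windows null device (the /dev/null equivalent), so no
    # sincedb state is persisted and the file is re-read from the start.
    sincedb_path => "NUL"
  }
}
```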

I made the change and let it run for almost 25 minutes, and it was still printing the same loop. I can't imagine it's this slow, since my file is only ~10 KB.

It is going to keep printing that forever: the file input tails the file indefinitely in case additional lines are written to it.
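If the goal is a one-shot import rather than tailing, one common alternative is to read from stdin, since Logstash shuts the pipeline down once stdin closes. A sketch, with a hypothetical config filename:

```
# Hypothetical one-shot variant: replace the file input with stdin.
# Run as: bin/logstash -f once.conf < C:/Users/data/test.csv
# Logstash exits after stdin reaches end-of-file.
input {
  stdin { }
}
```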

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.