[ERROR][logstash.config.sourceloader] No configuration found in the configured sources

Hi Masters,

Please help me with this small query. I am trying to load an Excel-exported CSV into Elasticsearch (for Kibana) using Logstash. I have not changed or created any YML files.
I am getting the error below:

      Z:\>logstash -f logstash_tc.conf --debug
            Sending Logstash logs to Z:/logstash-7.6.2/logstash-7.6.2/logs which is now configured via log4j2.properties
            [2020-08-08T03:16:21,834][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"Z:/logstash-7.6.2/logstash-7.6.2/modules/fb_apache/configuration"}
            [2020-08-08T03:16:21,956][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x13d305e4 @directory="Z:/logstash-7.6.2/logstash-7.6.2/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
            [2020-08-08T03:16:21,963][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"Z:/logstash-7.6.2/logstash-7.6.2/modules/netflow/configuration"}
            [2020-08-08T03:16:21,967][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x615dbdd0 @directory="Z:/logstash-7.6.2/logstash-7.6.2/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
            [2020-08-08T03:16:22,095][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
            [2020-08-08T03:16:22,100][DEBUG][logstash.runner          ] node.name: "LAPTOP-OFODOV4N"
            [2020-08-08T03:16:22,103][DEBUG][logstash.runner          ] *path.config: "logstash_tc.conf"
            [2020-08-08T03:16:22,107][DEBUG][logstash.runner          ] path.data: "Z:/logstash-7.6.2/logstash-7.6.2/data"
            [2020-08-08T03:16:22,110][DEBUG][logstash.runner          ] modules.cli: []
            [2020-08-08T03:16:22,113][DEBUG][logstash.runner          ] modules: []
            [2020-08-08T03:16:22,117][DEBUG][logstash.runner          ] modules_list: []
            [2020-08-08T03:16:22,120][DEBUG][logstash.runner          ] modules_variable_list: []
            [2020-08-08T03:16:22,131][DEBUG][logstash.runner          ] modules_setup: false
            [2020-08-08T03:16:22,133][DEBUG][logstash.runner          ] config.test_and_exit: false
            [2020-08-08T03:16:22,191][DEBUG][logstash.runner          ] config.debug: false
            [2020-08-08T03:16:22,194][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
            [2020-08-08T03:16:22,198][DEBUG][logstash.runner          ] version: false
            [2020-08-08T03:16:22,202][DEBUG][logstash.runner          ] help: false
            [2020-08-08T03:16:22,207][DEBUG][logstash.runner          ] log.format: "plain"
            [2020-08-08T03:16:22,211][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
            [2020-08-08T03:16:22,214][DEBUG][logstash.runner          ] http.port: 9600..9700
            [2020-08-08T03:16:22,218][DEBUG][logstash.runner          ] http.environment: "production"
            [2020-08-08T03:16:22,222][DEBUG][logstash.runner          ] queue.type: "memory"
            [2020-08-08T03:16:22,227][DEBUG][logstash.runner          ] queue.drain: false
            [2020-08-08T03:16:22,231][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
            [2020-08-08T03:16:22,236][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
            [2020-08-08T03:16:22,241][DEBUG][logstash.runner          ] queue.max_events: 0
            [2020-08-08T03:16:22,244][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
            [2020-08-08T03:16:22,248][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
            [2020-08-08T03:16:22,251][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
            [2020-08-08T03:16:22,255][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
            [2020-08-08T03:16:22,258][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
            [2020-08-08T03:16:22,263][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
            [2020-08-08T03:16:22,266][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
            [2020-08-08T03:16:22,270][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
            [2020-08-08T03:16:22,277][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
            [2020-08-08T03:16:22,281][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
            [2020-08-08T03:16:22,284][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
            [2020-08-08T03:16:22,287][DEBUG][logstash.runner          ] keystore.file: "Z:/logstash-7.6.2/logstash-7.6.2/config/logstash.keystore"
            [2020-08-08T03:16:22,353][DEBUG][logstash.runner          ] path.queue: "Z:/logstash-7.6.2/logstash-7.6.2/data/queue"
            [2020-08-08T03:16:22,357][DEBUG][logstash.runner          ] path.dead_letter_queue: "Z:/logstash-7.6.2/logstash-7.6.2/data/dead_letter_queue"
            [2020-08-08T03:16:22,360][DEBUG][logstash.runner          ] path.settings: "Z:/logstash-7.6.2/logstash-7.6.2/config"
            [2020-08-08T03:16:22,365][DEBUG][logstash.runner          ] path.logs: "Z:/logstash-7.6.2/logstash-7.6.2/logs"
            [2020-08-08T03:16:22,368][DEBUG][logstash.runner          ] xpack.management.enabled: false
            [2020-08-08T03:16:22,372][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
            [2020-08-08T03:16:22,378][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
            [2020-08-08T03:16:22,382][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
            [2020-08-08T03:16:22,384][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
            [2020-08-08T03:16:22,387][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
            [2020-08-08T03:16:22,393][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
            [2020-08-08T03:16:22,396][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
            [2020-08-08T03:16:22,400][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
            [2020-08-08T03:16:22,404][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
            [2020-08-08T03:16:22,410][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
            [2020-08-08T03:16:22,413][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
            [2020-08-08T03:16:22,416][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
            [2020-08-08T03:16:22,420][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
            [2020-08-08T03:16:22,425][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
            [2020-08-08T03:16:22,432][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
            [2020-08-08T03:16:22,436][DEBUG][logstash.runner          ] node.uuid: ""
            [2020-08-08T03:16:22,439][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
            [2020-08-08T03:16:22,496][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
            [2020-08-08T03:16:22,509][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
            [2020-08-08T03:16:22,578][DEBUG][logstash.agent           ] Setting up metric collection
            [2020-08-08T03:16:22,671][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
            [2020-08-08T03:16:22,691][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
            [2020-08-08T03:16:22,870][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
            [2020-08-08T03:16:23,030][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
            [2020-08-08T03:16:23,042][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
            [2020-08-08T03:16:23,067][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
            [2020-08-08T03:16:23,091][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
            [2020-08-08T03:16:23,166][DEBUG][logstash.agent           ] Starting agent
            [2020-08-08T03:16:23,262][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["Z:/$RECYCLE.BIN", "Z:/GTF_Hack", "Z:/Movies", "Z:/Project 2017", "Z:/Projects", "Z:/System Volume Information", "Z:/Windows10Upgrade", "Z:/[localhost", "Z:/clean_data.csv", "Z:/elasticsearch-7.6.2-windows-x86_64", "Z:/elasticsearch-7.6.2-windows-x86_64.zip", "Z:/kibana-7.6.2-windows-x86_64", "Z:/kibana-7.6.2-windows-x86_64.zip", "Z:/logstash-7.6.2", "Z:/logstash-7.6.2.zip", "Z:/logstash.conf", "Z:/music"]}
            [2020-08-08T03:16:23,269][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"Z:/logstash_tc.conf"}
            **[2020-08-08T03:16:23,282][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.**
            [2020-08-08T03:16:23,313][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
            [2020-08-08T03:16:23,360][DEBUG][logstash.agent           ] Starting puma
            [2020-08-08T03:16:23,386][DEBUG][logstash.instrument.periodicpoller.os] Stopping
            [2020-08-08T03:16:23,396][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
            [2020-08-08T03:16:23,420][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
            [2020-08-08T03:16:23,459][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
            [2020-08-08T03:16:23,466][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
            [2020-08-08T03:16:23,492][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
            [2020-08-08T03:16:23,506][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
            [2020-08-08T03:16:23,534][DEBUG][logstash.api.service     ] [api-service] start
            [2020-08-08T03:16:23,923][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
            [2020-08-08T03:16:28,693][INFO ][logstash.runner          ] Logstash shut down.

logstash_tc.conf

    input {
        file {
            path => "Z:\GTF_Hack\GTF_Hack_result.csv"
            start_position => "beginning"
        }
    }
    filter {
        csv {
            separator => ","
            columns => ["TestCase_no","Execution_Status","Sprint"]
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "testcases"
            document_type => "Tc"
        }
    }

You have not included a directory path in path.config so it is looking in the root directory. Is that where the configuration file is?

Do not use backslash in the path option of a file input, it is treated as an escape. Use forward slash.
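Putting both fixes together, a sketch of what the corrected input section could look like. This assumes the CSV really lives at `Z:/GTF_Hack/GTF_Hack_result.csv` and that Logstash is invoked with a full path to the config file (e.g. `logstash -f Z:/logstash_tc.conf`), so the config loader does not glob from the drive root:

    input {
        file {
            # forward slashes: a backslash in this option is treated as an escape
            path => "Z:/GTF_Hack/GTF_Hack_result.csv"
            start_position => "beginning"
            # optional on Windows: discard sincedb state so the file is
            # re-read from the beginning on every run
            # sincedb_path => "NUL"
        }
    }

The `sincedb_path => "NUL"` line is only a common Windows convenience for testing, not something required by the error above.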


Omg!! Changed the path to forward slashes and it worked on the first try! You are god-like, thanks!

It's 4 AM in India. I seriously did not expect such a quick resolution.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.