Logstash reads input files but does not write any output

Hi, I'm having problems with Logstash.
It seems to be able to reach and read the files I want it to read, but there is no sign of any processing or output.
I'm trying to write both to Elasticsearch and stdout, but nothing comes out.
Even so, after the run I can find the sincedb file containing the names of the files I'm giving as input.

I'm just getting started with the Elastic Stack, so if someone could help me troubleshoot this I'd be very grateful.

Here is the pipeline configuration file content:

    input {

        file {
            path => "/opt/pulltweet/dumps/*.csv"
            type => "tweet"
        }

    }

    filter {
        csv {
            columns => ["tweet_id", "tweet_body", "published_time", "ref_tweets"]
            convert => {
                "published_time" => "date_time"
            }
        }

        json {
            source => "ref_tweets"
            target => "referenced_tweets"
        }

        mutate {
            remove_field => ["ref_tweets"]
        }

    }

    output {

        stdout {}

        elasticsearch {
            hosts => ["localhost:9200"]
            index => "%{type}_%{+YYYY.MM}"
        }

    }

Here is the output from running with --debug: debug output

Thank you in advance.

By default a file input will seek to the end of the file and then wait for new data to be written to it. If you want it to start at the beginning then set the start_position option.

Since you have already run Logstash, it will have recorded the size of the files in the sincedb and will not start at the beginning even if you set that option. You can use sincedb_path => "/dev/null" to force it to re-read everything.
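Putting both of those together, a minimal sketch of the input block might look like this (paths copied from the original config; note that sincedb_path => "/dev/null" is only suitable for testing, since it disables position tracking entirely and the files will be re-ingested on every run):

    input {
        file {
            path => "/opt/pulltweet/dumps/*.csv"
            type => "tweet"
            # read existing files from the top; only applies to files
            # that have no entry in the sincedb yet
            start_position => "beginning"
            # testing only: discard the sincedb so files are always re-read
            sincedb_path => "/dev/null"
        }
    }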

Wait, before I run Logstash I delete the sincedb file. Shouldn't Logstash then see those files for the first time and, by default, process anything already in them before waiting for new lines?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.