Logstash config works fine but data not populated in Elasticsearch

I'm loading a CSV file via Logstash and need the output indexed in Elasticsearch. Please see the config file details below, along with the commands executed so far. The terminal gets stuck after "Logstash startup completed".

root@karthikvalluri:/opt/logstash# bin/logstash --configtest -f Appletest.conf
Configuration OK
root@karthikvalluri:/opt/logstash# opt/bin/logstash -f Appletest.conf
-bash: opt/bin/logstash: No such file or directory
root@karthikvalluri:/opt/logstash# bin/logstash -f Appletest.conf
Settings: Default pipeline workers: 4
Logstash startup completed

Config file:

input {
  file {
    path => "/opt/logstash/yahoostock.csv"
    type => "core2"
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  csv {
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
    separator => ","
  }

  mutate {
    convert => { "High" => "float" }
    convert => { "Open" => "float" }
    convert => { "Low" => "float" }
    convert => { "Close" => "float" }
    convert => { "Volume" => "float" }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "Yahoo"
    workers => 4
  }
}

I searched older posts but without any success.

Logstash is probably tailing yahoostock.csv and waiting for new data to be added. Read up on sincedb in the file input documentation and the dozens and dozens of threads here that cover this topic.
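
For a one-off import, a common pattern is to point sincedb_path at /dev/null so Logstash never remembers its read position and re-reads the file from the beginning on every run. A minimal sketch of that input (same file path as above; /dev/null is the only change):

input {
  file {
    path => "/opt/logstash/yahoostock.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}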

I tried adding these lines:

sincedb_path => "/persistent/log"
sincedb_write_interval => 10

The data loads now but gets stuck again... see screenshot.

Please don't post screenshots when you can copy & paste.

Has all data been loaded? Logstash will not quit just because it reaches the end of the input file.
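
One way to check is Elasticsearch's cat API, which lists every index along with its document count. Assuming Elasticsearch is on localhost:9200:

curl 'http://localhost:9200/_cat/indices?v'

If the index shows up there with the expected docs.count, the data has been loaded.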

OK, no. When I refresh head (http://localhost:9200/_plugin/head/) I do not see my index.

What does your configuration look like? The configuration you posted earlier can't be what you're using now since your screenshot indicates that you're using a stdout output.

input {
  file {
    path => "/opt/logstash/yahoostock.csv"
    type => "core2"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "/persistent/log"
    sincedb_write_interval => 10
  }
}

filter {
  csv {
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
    separator => ","
  }

  mutate {
    convert => { "High" => "float" }
    convert => { "Open" => "float" }
    convert => { "Low" => "float" }
    convert => { "Close" => "float" }
    convert => { "Volume" => "float" }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "Yahoo"
    workers => 4
  }
  stdout {}
}
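
As a side note, the bare stdout {} output prints very little; giving it the rubydebug codec dumps each event in full, which makes this kind of debugging much easier. A small sketch of that change:

stdout { codec => rubydebug }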

I think index names have to be lowercase-only.
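
If that's the problem here, Elasticsearch will have been rejecting every index request for "Yahoo", which would explain the empty head view. A sketch of the corrected output block, with only the index name lowercased:

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "yahoo"
    workers => 4
  }
  stdout {}
}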

Thanks a lot! :) It works, I see the data in head.