Logstash: Problem regarding input plugin, "Logstash startup completed" does not get displayed

I am using a MacBook and trying to use Logstash. I am using the following config file:

input {
  file {
    path => "/Desktop/data.csv"
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
  mutate { convert => ["Volume", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "stock"
    workers => 1
  }
  stdout {}
}

After this I run $ bin/logstash -f logstash1.conf

And this is what I get:

Settings: Default pipeline workers: 4
Pipeline main started

I am stuck after this. The "Logstash startup completed" message doesn't get displayed. Kindly help me.

Is the modification time of data.csv older than 24 hours? If so, adjust the file input's ignore_older option.

If not, Logstash is probably tailing the file. Read about sincedb and check out the archives for details; it's an extremely common issue.
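For illustration, a file input configured to pick up older files might look like the sketch below (the path and the 604800-second window are hypothetical; the point is to set ignore_older comfortably larger than the age of your CSV):

```
input {
  file {
    path => "/Users/yourname/Desktop/data.csv"   # hypothetical absolute path
    start_position => "beginning"
    # Only files modified within the last N seconds are considered; the
    # default is one day, so raise it if your CSV is older than that.
    ignore_older => 604800
  }
}
```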

685b358426d6:logstash-2.3.2 paritp$ ./bin/logstash -f logstash1.conf -v
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 4
Registering file input {:path=>["/Desktop/data.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/Users/paritp/.sincedb_e51202a64a3aa13dd9ce7c8b25992f3b", :path=>["/Desktop/data.csv"], :level=>:info}
Using mapping template from {:path=>nil, :level=>:info}
Attempting to install template {:manage_template=>{"template"=>"logstash-", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"float"}, "longitude"=>{"type"=>"float"}}}}}}}, :level=>:info}
New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost"], :level=>:info}
Starting pipeline {:id=>"main", :pipeline_workers=>4, :batch_size=>125, :batch_delay=>5, :max_inflight=>500, :level=>:info}
Pipeline main started

This is what I get when I run it in verbose mode; please tell me what to do next.
I have also installed the file input plugin, and the following is my config file:

input {
  file {
    path => "/Desktop/data.csv"
    type => "core2"
    start_position => "beginning"
    ignore_older => 0
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
  mutate { convert => ["Volume", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "stock"
    workers => 1
  }
  stdout {}
}

Earlier I was making a mistake; there was some discrepancy with the path, I guess. Now I am able to see the output on the console, but for some reason Logstash still does not finish starting up.

I don't get the message:
Logstash startup completed

Please help.

That's expected. The file input plugin is waiting for more data. Its main use case is continuous monitoring of log files.
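If the goal is a one-off import rather than continuous tailing, one common workaround (a sketch, not something from this thread) is to read from stdin instead; Logstash shuts down on its own once stdin is exhausted:

```
input {
  stdin {
    type => "core2"   # same type tag as in the file-based config
  }
}
# keep the same csv/mutate filters and elasticsearch/stdout outputs
```

Run it by piping the CSV in, e.g. bin/logstash -f logstash1.conf < data.csv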

Currently, I always have to go back to the directory containing the .sincedb file and delete it; only after that does the output get displayed on the console. Is there a way to avoid doing this repeatedly?

And I read earlier in some discussion that someone else was also using the file input plugin, but in his case "Logstash startup completed" was getting displayed. Could you please help me?

Currently, I always have to go back to the directory containing the .sincedb file and delete it; only after that does the output get displayed on the console. Is there a way to avoid doing this repeatedly?

Setting the file input's sincedb_path parameter to /dev/null (assuming you're on a Unix-like OS) will effectively disable sincedb.
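Concretely, the input section would then look something like this (same path as in your config; only the sincedb_path line is new):

```
input {
  file {
    path => "/Desktop/data.csv"
    type => "core2"
    start_position => "beginning"
    # Writing the sincedb to /dev/null discards the recorded read position,
    # so the file is re-read from the beginning on every run.
    sincedb_path => "/dev/null"
  }
}
```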

Thanks a lot, Magnus. Really appreciate your help. :)

Could you also help with the other problem, of "Logstash startup completed" not being displayed?

Logstash 2.3 and later doesn't say "Logstash startup completed" but "Pipeline main started". I've filed https://github.com/elastic/logstash/issues/5372 to fix the documentation.

I have exactly the same issue as you: no output at all, and nothing viewable in Kibana. Can you show me your latest configuration file?

Below is my Logstash conf file:

input {
  file {
    path => "/test/table.csv"
    start_position => "beginning"
    ignore_older => 0
    type => "core2"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Date", "Open", "High", "Low", "Close", "Volume", "Adj Close"]
  }
  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
  mutate { convert => ["Volume", "float"] }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "stock"
    workers => 1
  }
  stdout {}
}

console:
spark:bin imsparkpan$ ./logstash -f logstash-file.conf -v
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 4
Registering file input {:path=>["/test/table.csv"], :level=>:info}
Using mapping template from {:path=>nil, :level=>:info}
Attempting to install template {:manage_template=>{"template"=>"logstash-", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, "latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", 
"doc_values"=>true}}}}}}}, :level=>:info}
New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost"], :level=>:info}
Starting pipeline {:id=>"main", :pipeline_workers=>4, :batch_size=>125, :batch_delay=>5, :max_inflight=>500, :level=>:info}
Pipeline main started

I am digging through the forums and keep seeing this "attempting to install template" message alongside failures to load files. I have posted (another) thread for the same issue; was this resolved?

The "attempting to install template" message is completely normal and has nothing to do with the problems described in this thread.

Thanks for clarifying, @magnusbaeck. So this is not a template file I need to locate? (The main issue is solved in the other thread.)

No, there's no file for you to locate.