Can't import data to elasticsearch from csv, stops at "Successfully started Logstash API endpoint {:port=>9600}"

I'm working on CentOS 7, this is my configuration file for Logstash:

input {
  file {
    path => "/etc/logstash/files/Chile.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["ID","region","population","sex","age","education","income","statusquo","vote"]
    separator => ","
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "http://10.33.32.194:9200"
    index => "elecciones"
  }
}

And this is the response:

Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,926][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,929][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,963][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://10.33.32.194:9200"]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,010][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,030][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,355][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_a853088c41dbc64d8b4fd23b46272715", :path=>["/etc/logstash/files/Chile.csv"]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,402][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x33f1a1a2 run>"}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,484][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,515][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
Feb 05 15:01:48 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:48,057][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

And it does not import the data from the file.

My guess is that there is a sincedb file telling Logstash it has already read the file, so it is waiting for new data to be appended to it. Try adding

 sincedb_path => "/dev/null"

to the file input. If that does not help then enable "--log.level trace" and see what filewatch has to say.
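For reference, the file input from the config above with that setting added would look like the sketch below. Pointing sincedb_path at /dev/null means read positions are never persisted, so Logstash re-reads the whole file on every run:

 input {
   file {
     path => "/etc/logstash/files/Chile.csv"
     start_position => "beginning"
     sincedb_path => "/dev/null"
   }
 }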

I actually deleted that part, because it was something I was just trying. But it behaves the same if I run it without it. Thanks for replying.

I tried adding sincedb_path => "/dev/null" and it made it not even run; it just restarted every 2 minutes.

Does it log an error?

It worked :slight_smile: Thanks! I had written it the wrong way, but adding the sincedb_path made it work. Thank you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.