I'm working on CentOS 7. This is my Logstash configuration file:
input {
  file {
    path => "/etc/logstash/files/Chile.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ID","region","population","sex","age","education","income","statusquo","vote"]
    separator => ","
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => ["http://10.33.32.194:9200"]
    index => "elecciones"
  }
}
And this is the log output:
Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,926][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,929][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
Feb 05 15:01:46 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:46,963][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["http://10.33.32.194:9200"]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,010][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,030][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,355][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_a853088c41dbc64d8b4fd23b46272715", :path=>["/etc/logstash/files/Chile.csv"]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,402][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x33f1a1a2 run>"}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,484][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
Feb 05 15:01:47 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:47,515][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
Feb 05 15:01:48 caas-lv3-web194 logstash[7719]: [2019-02-05T15:01:48,057][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
The pipeline starts without errors, but the data from the file is never imported into Elasticsearch.
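One thing I suspect, based on the `sincedb_path` line in the log, is that the file input records how far it has already read each file in a sincedb file, and `start_position => "beginning"` only applies to files Logstash has never seen before. If the file was picked up in an earlier run, it won't be re-read on restart. A sketch of the input block with sincedb tracking disabled (setting `sincedb_path => "/dev/null"` is a common way to force a full re-read while testing; it is not meant for production):

```
input {
  file {
    path => "/etc/logstash/files/Chile.csv"
    start_position => "beginning"
    # Discard read-position tracking so the whole file is re-read
    # on every pipeline restart (testing only).
    sincedb_path => "/dev/null"
  }
}
```

If that is the cause, deleting the generated sincedb file under /var/lib/logstash/plugins/inputs/file/ before restarting Logstash should have the same effect.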