I've spent days in this forum before posting. I have a large CSV file and conf file with mappings from a previous ELK VM (no longer running) that worked great on alpha 5.x. File permissions allow read/write to all ELK directories. This install is on a local MacBook running Sierra 10.12.3.
Logstash will not ingest my test.csv using test.conf; the startup log stops after "Attempting to install template" and nothing is ever indexed.

My test conf file:
input {
  file {
    path => "/test.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["0","1","2"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "test"
  }
  stdout {
    codec => rubydebug
  }
}
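Before posting I also ran a config syntax check (a sketch, assuming the standard 5.x layout where the logstash binary lives under `bin/`; adjust paths for a Homebrew or tarball install):

```shell
# verify the config parses cleanly without starting the pipeline
bin/logstash -f test.conf --config.test_and_exit

# then run with debug logging to watch what the file input is doing
bin/logstash -f test.conf --log.level=debug
```

The debug run is where the log output below comes from.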
[2017-03-03T11:50:41,794][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-03-03T11:50:41,911][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-03-03T11:50:41,920][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-03-03T11:50:41,921][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::HTTP:0x75fc9232 URL:http://localhost:9200>]}
[2017-03-03T11:50:41,922][DEBUG][logstash.filters.csv ] CSV parsing options {:col_sep=>",", :quote_char=>"""}
[2017-03-03T11:50:41,924][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-03-03T11:50:41,929][INFO ][logstash.pipeline ] Pipeline main started
[2017-03-03T11:50:41,935][DEBUG][logstash.agent ] Starting puma
[2017-03-03T11:50:41,937][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-03-03T11:50:41,940][DEBUG][logstash.api.service ] [api-service] start
[2017-03-03T11:50:42,026][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-03T11:50:46,932][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-03-03T11:50:51,936][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-03-03T11:50:55,360][DEBUG][logstash.inputs.file ] _globbed_files: /test.csv: glob is: []
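That last debug line is what confuses me: `_globbed_files: /test.csv: glob is: []` looks like the file input matched zero files at the path from my config. As far as I can tell the file is really there and readable; these are the checks I ran (hypothetical stand-ins for my actual session, using the literal path from the config):

```shell
# confirm the file exists at the exact path the file input is globbing
ls -l /test.csv

# confirm the user running logstash can actually read it
head -3 /test.csv
```

Is `/test.csv` being interpreted as the root of the filesystem rather than my home or working directory? Do I need the full path, e.g. under /Users/... ?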