Insert CSV to ES


(Tal) #1

Hi,
I'm trying to insert a new CSV file into Elasticsearch.
I got "Successfully started Logstash API endpoint {:port=>9600}",
but nothing happens. It seems to me that Logstash is using a previous mapping I made for another file.
Is that the case?
The file looks like this:
User_Id,Age,Gender,Occupation,Zip_Code
1,24,M,technician,85711
2,53,F,other,94043
3,23,M,writer,32067

My config file looks like this:

input {
  file {
    path => "/Users/office/Desktop/Elasticsearch data/ufo.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["User_Id","Age","Gender","Occupation","Zip_Code"]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "ufo"
    document_type => "found"
  }
  stdout {}
}
I got this success message:
Sending Logstash's logs to C:/logstash/logs which is now configured via log4j2.properties
[2017-08-01T11:57:41,211][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-08-01T11:57:41,238][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-08-01T11:57:41,396][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x4a4e9cc0 URL:http://localhost:9200/>}
[2017-08-01T11:57:41,398][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-01T11:57:41,476][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-01T11:57:41,485][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x4a7938dd URL://localhost>]}
[2017-08-01T11:57:41,495][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-01T11:57:42,322][INFO ][logstash.pipeline ] Pipeline main started
[2017-08-01T11:57:42,596][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.