Hi,
I created a config file to import a CSV file into Elasticsearch, but the connection between ES and Logstash produces a warning. This is the log output I get:
> Sending Logstash's logs to C:/Project/elk/logstash/logs which is now configured via log4j2.properties
> [2017-04-27T09:48:51,839][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
> [2017-04-27T09:48:51,839][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x30f69343 URL:http://localhost:9200>, :healthcheck_path=>"/"}
> [2017-04-27T09:48:51,939][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x30f69343 URL:http://localhost:9200>}
> [2017-04-27T09:48:51,939][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
> [2017-04-27T09:48:51,999][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
> [2017-04-27T09:48:52,009][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
> [2017-04-27T09:48:52,014][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
> [2017-04-27T09:48:52,019][INFO ][logstash.pipeline        ] Pipeline main started
> [2017-04-27T09:48:52,109][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
This is my file.config:
input {
  file {
    path => "/Users/salma/Desktop/creditcard.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Time","V1","V2","V3","V4","V5","V6","V7","V8","V9","V10","V11","V12","V13","V14","V15","V16","V17","V18","V19","V20","V21","V22","V23","V24","V25","V26","V27","V28","Amount"]
    remove_field => ["class"]
  }
}
output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "dataset"
    sniffing => false
  }
  stdout { codec => rubydebug }
}
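To check on my side whether the data actually reached Elasticsearch, I run roughly the following (a sketch assuming ES is on the default port, with the `dataset` index name from my config):

```shell
# Verify Elasticsearch itself answers on the default port
curl -XGET "http://localhost:9200/?pretty"

# Check whether the "dataset" index exists and how many documents it holds
curl -XGET "http://localhost:9200/dataset/_count?pretty"
```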
I am using ELK 5.2.1.
Can anyone help, please?
Thanks