Nothing shows in Elasticsearch or Kibana when importing a CSV file


(salma) #1

Hi,
I created a file.config to import a CSV file into Elasticsearch, but the connection between ES and Logstash logs a warning. I get this WARN:

> Sending Logstash's logs to C:/Project/elk/logstash/logs which is now configured via log4j2.properties
> [2017-04-27T09:48:51,839][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
> [2017-04-27T09:48:51,839][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x30f69343 URL:http://localhost:9200>, :healthcheck_path=>"/"}
> [2017-04-27T09:48:51,939][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x30f69343 URL:http://localhost:9200>}
> [2017-04-27T09:48:51,939][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
> [2017-04-27T09:48:51,999][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
> [2017-04-27T09:48:52,009][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
> [2017-04-27T09:48:52,014][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
> [2017-04-27T09:48:52,019][INFO ][logstash.pipeline        ] Pipeline main started
> [2017-04-27T09:48:52,109][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

This is my file.config:

input {
  file {
    path => "/Users/salma/Desktop/creditcard.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
     separator => ","
     columns => ["Time","V1","V2","V3","V4","V5","V6","V7","V8","V9","V10","V11","V12","V13","V14","V15","V16","V17","V18","V19","V20","V21","V22","V23","V24","V25","V26","V27","V28","Amount"]
     remove_field => ["class"]

  }
}
output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "dataset"
    sniffing => false
  }
  stdout { codec => rubydebug }
}
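As an aside, the long `columns` list above is just Time, V1 through V28, and Amount (the columns of the credit-card CSV), so it can be generated rather than typed by hand. A quick sketch in Python, purely illustrative and not part of the Logstash config:

```python
# Build the column list for the credit-card CSV: Time, V1..V28, Amount.
columns = ["Time"] + [f"V{i}" for i in range(1, 29)] + ["Amount"]

print(len(columns))  # prints 30
print(columns[0], columns[1], columns[28], columns[29])  # prints Time V1 V28 Amount
```

Comparing the generated list against the one in the filter is an easy way to spot a missing or misspelled column name.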

I use ELK 5.2.1.
Can anyone help, please?

Thanks


(Pablo) #2

Hi

That warning just says the connection was restored, probably after being lost earlier.
I see it every time I restart Logstash, though I don't think it should really be a warning. I might be wrong, though.

By the way, the topic title is long and a bit unreadable; you could edit it to something like "Logstash warning meaning".


(salma) #3

Hi,
My problem is the connection between Logstash and Elasticsearch, because nothing shows up when I import a CSV file.


(Pablo) #4

But can you see the output of the CSV in stdout?

Open the CSV file, edit it, and save it to test; it might be an old file that Logstash is not picking up.


(salma) #5

Hi,
Maybe the problem is in Windows, because when the config file runs on Linux the only warning is "Restored connection to ES instance".


#6

Are you able to see the data in Kibana?


(salma) #7

No, neither in Kibana nor in Elasticsearch.


#8

Comment out the index line as below; just replace this in your configuration:

output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    #index => "dataset"
    sniffing => false
  }
  stdout { codec => rubydebug }
}


#9

Kindly remove the double quotes from the line below.

start_position => "beginning"


(salma) #10

The same problem:

09:15:32.521 [[main]-pipeline-manager] WARN  logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x460208e0 URL:http://localhost:9200>}

#11

This message is okay; it is just restoring the connection. After the changes, are you able to see the data in Kibana or not?

If not, please check how many indices Elasticsearch has. If you are using Linux, execute the command below:

curl http://localhost:9200/_cat/indices

Then try running the command below and post the complete log output here:

./logstash -f file.config --debug


(salma) #12

Nothing shows in Kibana, and localhost returns this:

yellow open .kibana H3neU9_NSz6OQ64kyydhXA 1 1 1 0 3.1kb 3.1kb


(Álvaro Sanz Garrigues) #13

Salma, are you working on an enterprise computer, or one with usage restrictions?

I had a similar problem and finally made it work. I was doing it on my enterprise laptop, which is restricted. I created an Ubuntu VM where I have full control, and there, with the same Logstash configuration, it finally works.
So I guess that on my enterprise computer there were permission issues or something similar preventing Logstash from working as expected.


#14

Yes, because the index was not created.
Please execute my second command, which will give complete information so that we can troubleshoot further.


(salma) #15

Hi Álvaro, I work on Windows. I tested my file.config on Ubuntu and it works with the same data, but I need to run it on Windows and I don't have any solution.


#16

Oh, you are facing the issue only on Windows... Sorry, I thought you were facing it on Linux too.

I hope Álvaro can give suggestions for resolving it in a Windows environment.


(Álvaro Sanz Garrigues) #17

I couldn't resolve it on Windows; I solved it on Ubuntu.
Does your Windows OS user have any restrictions, or is it an enterprise computer?
I'm not sure this was the problem, but my company has strong security settings on its devices, and I guess the problem was there; it produces conflicts between ELK and the security settings or restrictions of the OS user.


(salma) #18

It is an enterprise computer.


(Álvaro Sanz Garrigues) #19

So the problem is probably there; just build a Windows VM on your computer and try there.
I hope it solves your problem.


#20

Hi Salma,

Please change sincedb_path => "/dev/null" to sincedb_path => "NUL":

Linux -> sincedb_path => "/dev/null"
Windows -> sincedb_path => "NUL"
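Putting the thread's suggestions together, the input block of a Windows-adapted config might look like the sketch below. The path shown is an assumption (the original config used a macOS-style /Users/... path); adjust it to wherever the CSV actually lives on the Windows machine. Forward slashes are generally the safer choice in the file input's path on Windows.

```
input {
  file {
    # Assumed Windows location of the CSV; note the drive letter and forward slashes
    path => "C:/Users/salma/Desktop/creditcard.csv"
    start_position => "beginning"
    # NUL is the Windows equivalent of /dev/null
    sincedb_path => "NUL"
  }
}
```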

It should work.

Regards
Raja