I am using Logstash to feed my CSV file into Elasticsearch. Logstash starts up without errors, but the
file's data never loads into the index.
I tried checking with GET /_cat/indices?v:
health status index pri rep docs.count docs.deleted store.size pri.store.size
green open .marvel-es-data 1 1 3 0 7.8kb 3.9kb
green open .marvel-es-2016.03.11 1 1 801 32 584.8kb 278.1kb
green open .kibana 1 1 2 0 29.9kb 14.9kb
green open xxxxx 5 1 0 0 1.5kb 795b
Thanks Mark.
It started working when I copied and pasted the data in after starting Logstash.
But in Kibana I do not see any of the field names, and the events are tagged with "_csvparsefailure".
I created my index in Elasticsearch:
PUT /approved
{
  "mappings": {
    "default": {
      "properties": {
        "Name and Appl #": { "type": "string", "index": "not_analyzed" },
        "Suppl #": { "type": "integer", "index": "not_analyzed" },
        "Ace Ingrs": { "type": "string", "index": "not_analyzed" },
        "Compy #": { "type": "string", "index": "not_analyzed" },
        "Appral Type": { "type": "string", "index": "not_analyzed" },
        "appral date": { "type": "date", "format": "yyyy-MM-dd" }
      }
    }
  }
}
I provided my config file. Can you please help me resolve this?
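(For illustration only: a Logstash pipeline for a CSV with the columns from the mapping above would usually look something like the sketch below. The file path, separator, and hosts here are assumptions, not the poster's actual settings.)

input {
  file {
    path => "/path/to/approved.csv"    # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"        # re-read the whole file on every test run
  }
}

filter {
  csv {
    # column names chosen to line up with the index mapping above
    columns => ["Name and Appl #", "Suppl #", "Ace Ingrs", "Compy #", "Appral Type", "appral date"]
    separator => ","                   # must match the delimiter actually used in the file
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # assumed local node
    index => "approved"
  }
}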
Start by removing the elasticsearch output (use stdout { codec => rubydebug } instead) and swapping the file input for stdin {}.
Then run LS, push a line of your file through it, check the output, and make changes from there.
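Concretely, the cut-down test config could look something like this (a sketch; keep your own csv filter settings in the filter block):

input {
  stdin { }
}

filter {
  csv {
    # your existing csv filter settings (columns, separator) go here
  }
}

output {
  stdout { codec => rubydebug }
}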