Hi,
I am trying to load data into Elasticsearch using a Logstash config. I am using the stdin input plugin to read the data from a file; here is the command:
/usr/share/logstash/bin/logstash -f logstashConfigFile < dataFile
logstashConfigFile contains:
input {
  stdin {}
}
filter {
  csv {
    separator => "	"   # a literal tab character, since the data file is tab-separated
    columns => ["id", "address_id", "value"]
  }
  mutate { convert => ["id", "integer"] }
  mutate { convert => ["value", "float"] }
}
output {
  elasticsearch {
    hosts => ["localhost"]
    action => "update"
    doc_as_upsert => true
    document_id => "%{id}"
    index => "zcli_%{address_id}"
    document_type => "adresss"
  }
  file {
    path => "/home/tmp/logs/test.log"
  }
}
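For reference, this is roughly how the counts can be compared (a sketch; it assumes Elasticsearch is listening on the default port 9200 and that all of the target indices match the zcli_* pattern from the config above):

# total rows in the input file
wc -l < dataFile

# events that made it through the pipeline
# (the file output writes one line per event)
wc -l < /home/tmp/logs/test.log

# refresh so recently indexed documents are visible, then count them
curl -s -XPOST 'localhost:9200/zcli_*/_refresh'
curl -s 'localhost:9200/zcli_*/_count?pretty'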
The data file contains tab-separated values. After running the command, I noticed that the total number of rows in the input file does not match the number of documents uploaded to Elasticsearch.
For example, if the file has rows 1 to 10000, roughly rows 1 to 8000 show up in Elasticsearch and the last 2000 are missing. I am stuck on this and have tried various things, but nothing worked. Please help me out here.
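One thing I am unsure about, given the update/doc_as_upsert output: since document_id => "%{id}", any rows that share the same id value would be upserted into the same document instead of creating new ones, which would make the document count lower than the row count. A quick check for duplicate ids in the input (assuming id is the first tab-separated column, which is cut's default delimiter):

# count how many distinct id values appear more than once
cut -f1 dataFile | sort | uniq -d | wc -l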