Can't append documents to an existing index

(Nishant Verma) #1


We are using Logstash to sink Kafka data into Elasticsearch.

When we insert docs into a non-existent index, Logstash creates the index and starts indexing docs there.
But when we try the same against an already existing index, nothing happens. There is no error on the Logstash console; it just keeps printing "2017-05-17T12:03:50.997Z %{host} %{message}" with different timestamps.

Do we have to add some setting for this requirement?

(Nishant Verma) #2

This is what I noticed in a recent run (no flush size is defined in my conf file):

On an existing index, doc append is very slow.
If the index does not exist and Logstash creates it, doc insertion is very fast.

How can I improve the performance of doc appends on an existing index?
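For reference, a sketch of the settings that usually govern bulk indexing throughput. This assumes a Logstash 5.x install (the flush_size option on the elasticsearch output was later deprecated, and the host shown is a placeholder), so check the docs for your release:

# logstash.yml - pipeline-level batching
pipeline.workers: 4        # number of parallel filter/output workers
pipeline.batch.size: 250   # events collected per worker before flushing

# elasticsearch output - bulk request sizing (Logstash 5.x)
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder host
    index => "live"
    flush_size => 500             # max events per bulk request
  }
}

Larger batches mean fewer, bigger bulk requests to Elasticsearch, which is generally where append throughput is won or lost.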

(Mark Walkom) #3

Can we see your config?

(Nishant Verma) #4

input {
  kafka {
    bootstrap_servers => ","
    topics => "live2"
    codec => "plain"
  }
}

filter {
  csv {
    separator => ";"
    columns => ["NAME","EMPLOYEEID","DEPT","DESIGNATION","ID","GOC","PROJ","PM","DM","TL","PFC","BS","BoA","TA","LA","DA","MA","TLA","GS"]
  }
  mutate {
    remove_field => [ "message" ]
  }
}

output {
  elasticsearch {
    hosts => [""]
    index => "live"
  }
  stdout {}
}
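One likely explanation for the repeated "%{host} %{message}" console line from the first post, assuming the bare stdout output is using a plain-text codec whose format string references those fields: the mutate filter above removes the "message" field, so the %{message} sprintf reference cannot resolve and is printed literally. Switching stdout to the rubydebug codec prints the full structured event, which makes it much easier to see what is actually being sent to Elasticsearch:

output {
  stdout { codec => rubydebug }
}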

(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.