Can't append documents to an existing index

Hello

We are using Logstash to sink Kafka data to Elasticsearch.

When we try to insert docs into a non-existing index on Elasticsearch, Logstash creates the index and starts indexing docs there.
But when we try to do the same on an already existing index, nothing happens. There is no error on the Logstash console; it just keeps printing "2017-05-17T12:03:50.997Z %{host} %{message}" with different timestamps.

Do we have to add some tag or setting for this requirement?
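As an aside, the timestamped "%{host} %{message}" lines appear to be the event printed with field references that no longer exist on it (the kafka input does not set a host field, and message is removed by the mutate filter), so they are not an error in themselves. A minimal sketch for inspecting what each event actually contains, assuming the existing stdout output is swapped for this one:

output {
  # rubydebug prints the full event as a structured hash, which makes it easy
  # to see which fields are actually present after the filters have run
  stdout { codec => rubydebug }
}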

This is what I noticed in a recent run (no flush size is defined in my conf file):

On an existing index, appending docs is very slow.
If the index does not exist and Logstash creates it, then adding docs is very fast.

How can I increase the performance of appending docs to an existing index?
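Since no flush size is defined, one knob to try is the bulk request size of the elasticsearch output. A minimal sketch, assuming a Logstash version from this era (around 5.x) where flush_size and idle_flush_time are still accepted; newer releases drop them in favour of pipeline.batch.size in logstash.yml. The values below are example values only:

output {
  elasticsearch {
    hosts => ["http://1.2.4.13:9200"]
    index => "live"
    # Buffer this many events into a single bulk request before flushing.
    flush_size => 5000
    # Flush a partial batch after this many seconds of inactivity.
    idle_flush_time => 1
  }
}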

Can we see your config?

input {
  kafka {
    bootstrap_servers => "1.2.4.14:9092,1.2.4.18:9092"
    topics => "live2"
    codec => "plain"
  }
}

filter {
  csv {
    separator => ";"
    columns => ["NAME","EMPLOYEEID","DEPT","DESIGNATION","ID","GOC","PROJ","PM","DM","TL","PFC","BS","BoA","TA","LA","DA","MA","TLA","GS"]
  }

  mutate {
    remove_field => [ "message" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://1.2.4.13:9200"]
    index => "live"
  }
  stdout {}
}
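For reference, the elasticsearch output above appends by default: each event is indexed as a new document with an auto-generated _id, even when the index already exists. A minimal sketch of the alternative, assuming EMPLOYEEID uniquely identifies a record (a hypothetical choice, not part of the original config); with an explicit document_id, repeated ids update the existing document instead of adding a new one:

output {
  elasticsearch {
    hosts => ["http://1.2.4.13:9200"]
    index => "live"
    # Events sharing the same EMPLOYEEID overwrite the existing document
    # rather than being appended as new documents.
    document_id => "%{EMPLOYEEID}"
  }
}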
