I've removed the old index and created a new one with a predefined mapping.
Last time, 760 710 643 events went through Logstash without any problems. This time I have a predefined mapping and have replaced the kv filter with a custom Ruby filter, because kv was not working as it should.
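For reference, a minimal sketch of what the Ruby replacement looks like, assuming semicolon-separated key=value pairs in the message field (the field name and separators here are placeholders, not my exact filter):

```
filter {
  ruby {
    code => "
      # split the raw message into key=value pairs and copy them onto the event
      event.get('message').to_s.split(';').each do |pair|
        key, value = pair.split('=', 2)
        event.set(key.strip, value.strip) if key && value
      end
    "
  }
}
```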
After 81 711 586 events I'm at a standstill. I'm getting this error:
2017-09-07T08:37:04+02:00 ERR Failed to publish events caused by: read tcp 127.0.0.1:49569->127.0.0.1:5044: i/o timeout
2017-09-07T08:37:04+02:00 INFO Error publishing events (retrying): read tcp 127.0.0.1:49569->127.0.0.1:5044: i/o timeout
2017-09-07T08:37:33+02:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_errors=1 libbeat.logstash.published_but_not_acked_events=1940
2017-09-07T08:38:03+02:00 INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=1 libbeat.logstash.publish.write_bytes=550
As was recommended here: ERR Failed to publish events caused by: read tcp IP:40634->IP:5044: i/o timeout, I've raised the timeout to 60s, but it still didn't work.
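To be clear about what I changed, this is roughly where the timeouts live (a sketch with placeholder values; `timeout` on the Filebeat side, `client_inactivity_timeout` on the beats input):

```
# filebeat.yml (excerpt)
output.logstash:
  hosts: ["127.0.0.1:5044"]
  timeout: 60   # raised from the 30s default
```

```
# logstash beats input (excerpt)
input {
  beats {
    port => 5044
    client_inactivity_timeout => 120  # seconds before an idle connection is closed
  }
}
```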