Logstash filter elasticsearch open connections issue


(Mahesh) #1

Hello,

Currently we are sending data from Kafka to Elasticsearch, and in between we use the Logstash elasticsearch filter plugin to query previous documents for their timestamp and a few other fields. While doing this, the number of open connections between Logstash and Elasticsearch grows very high. Because of those open connections some threads end up waiting, and then Logstash crashed.

Is there any way to reduce the number of open connections between Logstash and Elasticsearch when using the elasticsearch filter plugin?

Ref:
Kafka QPS (queries per second): 50,000
Logstash pipeline configuration (with the elasticsearch filter plugin):
input {
  kafka {
  }
}
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test_index*"
    enable_sort => false
    query => "type:type%{[oi]} AND _id:%{[d]}-%{[ai]}"
    fields => [["ts","stime"],["ml","mlarray"],["dl","dlarray"],["wl","wlarray"]]
  }
}
output {
  elasticsearch {
  }
}
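For clarity (my own illustration, not part of the original config): the query option is a sprintf-style template that is filled in from each event's fields before the lookup is sent to Elasticsearch. With made-up field values oi=5, d=doc1, ai=3, the expansion can be sketched like this:

```shell
# Illustration of how the filter's query template expands per event.
# The field values (oi=5, d=doc1, ai=3) are made up for this example.
oi=5; d=doc1; ai=3
printf 'type:type%s AND _id:%s-%s\n' "$oi" "$d" "$ai"
# prints: type:type5 AND _id:doc1-3
```

So every incoming event triggers one such query against test_index*, which at 50,000 events per second means a very high query rate against the cluster.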

Please help me.


(Mahesh) #2

Hi Elastic team,

Could you help us with the above issue? Any advice would be appreciated.

Is there any way to reduce the number of open connections between Logstash and Elasticsearch when using the elasticsearch filter plugin?


(Christian Dahlqvist) #3

This forum is manned by volunteers, so please be patient.


(Mahesh) #4

Ok, thank you @Christian_Dahlqvist. I look forward to hearing a good solution or some advice from your team.


(Christian Dahlqvist) #5

Which version of Logstash are you using?

How many open connections are you seeing?

How many worker threads is Logstash using?


(Mark Walkom) #6

Please show the logs for the time that this issue happened.


(Mahesh) #7

@Christian_Dahlqvist Thank you for quick response. Please find the details.

Which version of Logstash are you using?

Logstash 5.4.3

How many open connections are you seeing?

~17,000 connections

How many worker threads is Logstash using?

The configuration is the default. We ran a POC on a 2-core machine, so the number of pipeline workers is the default of 2.
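For reference, these are the logstash.yml pipeline settings that bound how many elasticsearch filter queries can be in flight at once (the values shown are my understanding of the Logstash 5.x defaults on our 2-core box):

```yaml
# logstash.yml (Logstash 5.x defaults, as far as I understand them)
pipeline.workers: 2       # defaults to the number of CPU cores; each worker runs the filters
pipeline.batch.size: 125  # events a worker pulls from the queue per batch
pipeline.batch.delay: 5   # ms to wait for a batch to fill before flushing
```

Lowering pipeline.batch.size or the worker count should reduce how many filter queries run concurrently, at the cost of throughput.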

Problem:
The number of open connections between Logstash and Elasticsearch is very high while we query fields of previous documents with the elasticsearch filter plugin. There are also a great many connections in the TIME_WAIT state on the Logstash server.
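To show how I counted the connection states, here is a sketch (the port 9200 filter matches our config, but the sample lines are made up for illustration; on the real server I piped the live output of `ss -tan` into the same awk):

```shell
#!/bin/sh
# Tally TCP connections to Elasticsearch (port 9200) by state.
# In practice, feed live data:  ss -tan | count_es_states
count_es_states() {
  awk '$5 ~ /:9200$/ { count[$1]++ } END { for (s in count) print s, count[s] }' | sort
}

# Illustrative sample in 'ss -tan' column order (state, queues, local addr, peer addr):
printf '%s\n' \
  'ESTAB      0 0 10.0.0.5:51000 10.0.0.9:9200' \
  'TIME-WAIT  0 0 10.0.0.5:51001 10.0.0.9:9200' \
  'TIME-WAIT  0 0 10.0.0.5:51002 10.0.0.9:9200' | count_es_states
```

On our box the TIME-WAIT count dominated, which is what made me suspect the filter is opening short-lived connections rather than reusing them.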

Note:
There is no issue when we send events directly from Kafka to Elasticsearch through Logstash without the elasticsearch filter plugin.


(Mahesh) #8

@warkolm There are no errors or warnings in the Logstash logs:

[2017-09-19T02:59:55,993][INFO ][logstash.pipeline ] Pipeline main started
[2017-09-19T02:59:56,095][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}


(Mark Walkom) #9

Then how do you know it crashed?


(Mahesh) #10

@warkolm Kafka consumer lag keeps increasing for that Logstash consumer group, and when I checked the consumers with the command below, it showed no Logstash consumer threads and no events being consumed.

bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group ${LOGSTASH_GROUP}


(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.