ELK Searches from Splunk


On a single server I have ELK (v7.6.0) and Splunk.

All sources that support the syslog protocol are being ingested into ELK.

Taking advantage of Splunk's custom search commands, a query is made to ELK with this kind of code:

`| ess eaddr="http://localhost:9200" index=paloalto* tsfield="@timestamp" query="SourceIP:" fields="*"`
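For reference, a sketch of roughly the same search issued directly against Elasticsearch with the official Python client, bypassing Splunk entirely. The index pattern and timestamp field are taken from the `ess` invocation above; the Lucene query string `SourceIP:*` is an assumption about what `query="SourceIP:"` is meant to match. Running the same search both ways can help isolate whether the slowness is in Elasticsearch itself or in the Splunk plugin.

```python
# Sketch, assuming a local node on port 9200 as in the ess command.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])
resp = es.search(
    index="paloalto*",
    body={
        # "SourceIP:*" is an assumed reading of query="SourceIP:" above
        "query": {"query_string": {"query": "SourceIP:*"}},
        "sort": [{"@timestamp": "desc"}],
    },
    size=100,
)
print(resp["hits"]["total"], resp["took"], "ms")
```

The `took` value in the response tells you how long Elasticsearch itself spent on the query, independent of any Splunk-side overhead.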

The time it takes for the information to be displayed depends on the number of matches the query returns, which I suppose is expected and even obvious.

I have increased the server's resources to try to improve this performance, but as I am inexperienced I would like to ask those of you with more experience: is there any limit on the number of events that can be queried from Splunk to ELK?

At the moment I get the error message below, and since it comes from the Splunk plugin side I guess I have to investigate how to increase the 60-second timeout that is configured by default.

External search command 'ess' returned error code 1. Script output = "error_message=ConnectionTimeout at "/var2/splunk/splunk/etc/apps/elasticsplunk/bin/elasticsearch/connection/http_urllib3.py", line 155 : ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host=u'localhost', port=9200): Read timed out. (read timeout=60))"
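The traceback shows the plugin bundles elasticsearch-py (`http_urllib3.py`), so a plausible fix is to raise the client's read timeout where the plugin constructs its `Elasticsearch` object. Where exactly elasticsplunk does that you would need to find in its `bin/` code; the sketch below only shows the two standard elasticsearch-py knobs, client-wide and per-request:

```python
# Sketch of the elasticsearch-py timeout options (assumed to be what
# elasticsplunk uses under the hood, given the http_urllib3 traceback).
from elasticsearch import Elasticsearch

# Option 1: raise the read timeout for every request made by this client.
es = Elasticsearch(["http://localhost:9200"], timeout=300)

# Option 2: raise it for a single expensive search only.
resp = es.search(
    index="paloalto*",
    body={"query": {"match_all": {}}},
    request_timeout=300,
)
```

Note that a longer timeout only hides the symptom; if the search routinely takes more than 60 seconds, it is worth finding out why.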

Sounds like the issue might be on Splunk's side. I know of no specific rate-limit setting on the Elasticsearch side. It would be a good idea to make sure your query doesn't take too long to execute, though. You can turn on slow logs on the index. Also, take a look at your cluster in Stack Monitoring to make sure it isn't hitting a bottleneck. More details on what is actually happening on the Elastic side would be helpful, but these are some general suggestions when a performance question arises.
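To make the slow-log suggestion concrete, here is a sketch of enabling the search slow log on the indices in question via the Python client (the same settings can be applied with a `PUT /<index>/_settings` request). The thresholds are illustrative values, not recommendations:

```python
# Sketch: enable the search slow log on the paloalto* indices.
# Queries slower than these thresholds get logged at the given level.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])
es.indices.put_settings(
    index="paloalto*",
    body={
        "index.search.slowlog.threshold.query.warn": "10s",
        "index.search.slowlog.threshold.query.info": "5s",
        "index.search.slowlog.threshold.fetch.warn": "1s",
    },
)
```

Slow queries will then show up in the node's slow log file, which should tell you whether the 60-second timeouts are caused by the query phase or the fetch phase.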

Please note that that version is well past EOL and no longer supported; you should be looking to upgrade as a matter of urgency.

7.6.0 is EOL and no longer supported. Please upgrade ASAP.

(This is an automated response from your friendly Elastic bot. Please report this post if you have any suggestions or concerns :elasticheart: )