"Could not fetch URL" while using http output plugin


(rajesh) #1

Hi..

I am using the http output plugin in Logstash, and the error I am getting is:

 ERROR logstash.outputs.http - [HTTP Output Failure] Could not fetch URL {:url=>"http://xxx.xx.xx:xxx"
name=memoryStats.numBytesAllocated origin=p-redis unit=count value=671040 \",\"type\":\"syslog\",\"tags\":[\"_grokparsefailure\"]}", :headers=>{"Content-Type"=>"application/json"}, :message=>"Read timed out", :class=>"Manticore::SocketTimeout", :backtrace=>nil, :will_retry=>true}
14:53:13.875 [pool-684-thread-1] INFO  logstash.outputs.http - Retrying http request, will sleep for 0 seconds

And the weird thing is, it happens only sometimes. Can someone shed light on this?

Thanks in advance..


(rajesh) #2

Hi, can someone please help with this issue?


(Guy Boertje) #3

The error is not fatal because will_retry is set to true. So, occasionally, the request to your endpoint times out, but the plugin tries again and succeeds.

You should monitor this and examine the network and endpoint responsiveness if it happens too often for comfort.
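If the timeouts happen often and the endpoint is simply slow rather than down, one thing worth trying is raising the client timeouts on the http output. The option names below come from the logstash-output-http plugin (which uses the Manticore client that raised the SocketTimeout above); the URL is a placeholder, and you should check the docs for your installed plugin version before relying on these settings:

```
output {
  http {
    url => "http://example.local:8080"   # placeholder endpoint
    http_method => "post"
    format => "json"
    socket_timeout => 30     # seconds to wait on a read; default is 10
    connect_timeout => 30    # seconds to wait for the connection
    retry_failed => true     # keep retrying failed requests (the default)
  }
}
```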


(rajesh) #4

Thanks Guy for your reply.

It happens quite often, and then after some time the issue goes away. Is it a problem with the Logstash HTTP plugin or with my network? If it retries, there will definitely be a lag in indexing. Also, can you please confirm there won't be any data loss? What would be an alternative to this? Please help me with all the questions I have.

Thank you


(Guy Boertje) #5

The log output tells you what the lag might be. The code uses a power-of-two curve to determine the sleep time, up to a maximum of 60 seconds.
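The backoff described above can be sketched roughly as follows. This is a hypothetical helper, not the plugin's actual code, and the exact curve differs from the plugin's internals (note that the log above shows a first sleep of 0 seconds):

```ruby
# Exponential ("power of 2") backoff, capped at a maximum sleep time.
# backoff_seconds is a hypothetical name for illustration only.
def backoff_seconds(attempt, max = 60)
  [2**attempt, max].min
end

# Print the sleep time for a series of retry attempts.
(0..7).each do |attempt|
  puts "retry #{attempt}: sleep #{backoff_seconds(attempt)}s"
end
```

With this curve, by attempt 6 the sleep already hits the 60-second cap, so the worst-case per-request lag stays bounded.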

It's probably your network.

There may be data loss; you should test for it. Feed Logstash with a file input that reads a large number of lines, say 10 or 100 million, and send them to your HTTP endpoint. While Logstash is running, pause the HTTP endpoint in some way, or use a simple HTTP database service like Riak as the endpoint. Then count the documents in Riak and compare against the number of lines you sent.
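The test above could be sketched as a pipeline like the one below. The file path and endpoint URL are hypothetical and need to be adjusted to your setup; the Riak-style URL is just one example of a countable HTTP sink:

```
input {
  file {
    path => "/tmp/loadtest.log"      # hypothetical test file with a known line count
    start_position => "beginning"
    sincedb_path => "/dev/null"      # re-read the file from the top on every run
  }
}
output {
  http {
    url => "http://localhost:8098/riak/loadtest"  # hypothetical Riak bucket endpoint
    http_method => "post"
    format => "json"
  }
}
```

After the run, compare the document count in the bucket with the number of input lines; any shortfall during the paused period indicates loss.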


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.