Hello,
I seem to be getting some hanging when indexing documents. I'm still
playing around to see if I can pin down exactly what is causing the
issue. I am also not sure whether this is an ES issue, an issue with
the HTTP objects I am using to connect, or (most likely) an OS
network/socket issue: my indexer runs on Windows and submits files to
ES running on Linux. I've had my fair share of Windows socket issues
in the past, caused by the horribly low default limit on the number
of ephemeral ports.
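
For what it's worth, a quick (crude) way to check for that on the
Windows side is to count sockets stuck in TIME_WAIT, something like:

    import subprocess

    # Count sockets stuck in TIME_WAIT; thousands of them would point
    # at client-side ephemeral port exhaustion rather than at ES.
    proc = subprocess.Popen(["netstat", "-an"], stdout=subprocess.PIPE)
    output, _ = proc.communicate()
    time_wait = sum(1 for line in output.splitlines() if "TIME_WAIT" in line)
    print "TIME_WAIT sockets:", time_wait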
I am on master from last night.
I am indexing through pyelasticsearch, which talks to ES using
Python's HTTPConnection object. When everything starts up, indexing
churns away for a while as it should, but eventually my connection
hangs, and when that happens I see the exception below. I've hit this
with a single indexing thread and with multiple ones (well, in my
case it is actually separate indexer processes, to get around the
GIL, but that should look the same from the ES perspective). This is
the same exception I receive when I kill my indexer while there is an
open socket performing an index. Does the exception below mean that
the socket was closed by the client side before the data was
received, or while it was flowing over the socket?
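
In case it helps, here is a stripped-down sketch of what each indexer
process does (the host, index/type names, and load_documents() are
placeholders rather than my real code, and I'm assuming
pyelasticsearch's index(doc, index, doc_type, id) call signature):

    from pyelasticsearch import ElasticSearch

    # One client per process, reused for the whole run, pointed at the
    # Linux box running ES.
    conn = ElasticSearch('http://linux-es-host:9200/')  # placeholder host

    for doc_id, doc in enumerate(load_documents()):  # placeholder reader
        # Each call issues one HTTP index request via HTTPConnection.
        conn.index(doc, 'my-index', 'my-type', doc_id)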
Any other thoughts or ideas would be much appreciated.
Thanks,
Paul
[18:01:47,706][INFO ][node ] [Jackhammer] {elasticsearch/0.9.1-SNAPSHOT/2010-08-04T06:38:28}[7850]: started
[18:17:18,553][WARN ][http.netty ] [Jackhammer] Caught exception while handling client http traffic
java.lang.IllegalArgumentException: empty text
    at org.elasticsearch.common.netty.handler.codec.http.HttpVersion.<init>(HttpVersion.java:103)
    at org.elasticsearch.common.netty.handler.codec.http.HttpVersion.valueOf(HttpVersion.java:68)
    at org.elasticsearch.common.netty.handler.codec.http.HttpRequestDecoder.createMessage(HttpRequestDecoder.java:81)
    at org.elasticsearch.common.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:198)
    at org.elasticsearch.common.netty.handler.codec.http.HttpMessageDecoder.decode(HttpMessageDecoder.java:107)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:461)
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:434)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:80)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:545)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:754)
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:51)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:545)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:540)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:274)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:261)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:349)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.processSelectedKeys(NioWorker.java:281)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:201)
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.elasticsearch.common.netty.util.internal.IoWorkerRunnable.run(IoWorkerRunnable.java:46)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:636)