NPE in SignificanceHeuristicStreams.read while deserializing response (SearchResponse)

Hey guys, I'm getting a NullPointerException while using a significant_terms
aggregation. It happens at this line:

org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)

The error occurs during deserialization: Failed to deserialize response of
type [org.elasticsearch.action.search.SearchResponse]

I'm using the Java API. I printed the request and ran it manually through
the REST API, and everything worked fine. The error happens only when using
the Java API.

I'm using ES 1.3.2.

The printed search request:

{
  "from" : 0,
  "size" : 6,
  "timeout" : 30000,
  "query" : {
    "filtered" : {
      "query" : {
        "query_string" : {
          "query" : "ayrton senna",
          "fields" : [ "title^2.0", "description" ],
          "default_operator" : "and"
        }
      },
      "filter" : {
        "bool" : {
          "must" : [ {
            "range" : {
              "created_at" : {
                "from" : null,
                "to" : "2014-09-19T20:28:30.000Z",
                "include_lower" : true,
                "include_upper" : true
              },
              "_cache" : true
            }
          }, {
            "range" : {
              "published_at" : {
                "from" : null,
                "to" : "2014-09-19T20:28:30.000Z",
                "include_lower" : true,
                "include_upper" : true
              },
              "_cache" : true
            }
          }, {
            "range" : {
              "published_at" : {
                "from" : "2014-08-20T20:28:30.000Z",
                "to" : "2014-09-19T20:28:30.000Z",
                "include_lower" : true,
                "include_upper" : true
              },
              "_cache" : false
            }
          } ]
        }
      }
    }
  },
  "fields" : [ ],
  "aggregations" : {
    "topics" : {
      "significant_terms" : {
        "field" : "topic_ids",
        "size" : 20
      }
    }
  }
}

The complete error stacktrace:

[ERROR] 2014-09-19 20:29:13.177 c.b.s.SearchServlet - org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
org.elasticsearch.transport.TransportSerializationException: Failed to deserialize response of type [org.elasticsearch.action.search.SearchResponse]
    at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:152)
    at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:127)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NullPointerException
    at org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams.read(SignificanceHeuristicStreams.java:38)
    at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms.readFrom(SignificantLongTerms.java:126)
    at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:50)
    at org.elasticsearch.search.aggregations.bucket.significant.SignificantLongTerms$1.readResult(SignificantLongTerms.java:46)
    at org.elasticsearch.search.aggregations.InternalAggregations.readFrom(InternalAggregations.java:190)
    at org.elasticsearch.search.aggregations.InternalAggregations.readAggregations(InternalAggregations.java:172)
    at org.elasticsearch.search.internal.InternalSearchResponse.readFrom(InternalSearchResponse.java:116)
    at org.elasticsearch.search.internal.InternalSearchResponse.readInternalSearchResponse(InternalSearchResponse.java:105)
    at org.elasticsearch.action.search.SearchResponse.readFrom(SearchResponse.java:227)
    at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:150)
    ... 23 more

The (Scala) code I used to generate the request:

    val request = ....
    val topicsAggregation = significantTerms("topics").field("topic_ids").size(20)
    request.addAggregation(topicsAggregation)

The code to retrieve the aggregation (although it seems execution never gets
that far):

val terms: SignificantTerms = response.getAggregations.get("topics")
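
For reference, the same request boiled down to a minimal, self-contained main with the plain Java API looks roughly like this. It's a sketch, not my actual code: the class name, cluster name, host and index are placeholders, and I left out the date-range filters.

    import org.elasticsearch.action.search.SearchResponse;
    import org.elasticsearch.client.transport.TransportClient;
    import org.elasticsearch.common.settings.ImmutableSettings;
    import org.elasticsearch.common.transport.InetSocketTransportAddress;
    import org.elasticsearch.index.query.QueryBuilders;
    import org.elasticsearch.index.query.QueryStringQueryBuilder;
    import org.elasticsearch.search.aggregations.AggregationBuilders;
    import org.elasticsearch.search.aggregations.bucket.significant.SignificantTerms;

    public class SignificantTermsRepro {
        public static void main(String[] args) {
            // Placeholder cluster name, host and index -- adjust for your setup.
            TransportClient client = new TransportClient(ImmutableSettings.settingsBuilder()
                    .put("cluster.name", "my-cluster").build())
                    .addTransportAddress(new InetSocketTransportAddress("localhost", 9300));
            try {
                SearchResponse response = client.prepareSearch("articles")
                        // the date-range filters from the printed request are omitted here
                        .setQuery(QueryBuilders.queryString("ayrton senna")
                                .field("title", 2.0f)
                                .field("description")
                                .defaultOperator(QueryStringQueryBuilder.Operator.AND))
                        .addAggregation(AggregationBuilders.significantTerms("topics")
                                .field("topic_ids").size(20))
                        .setFrom(0).setSize(6)
                        .execute().actionGet();

                // The NPE is thrown while the response is deserialized, so it never gets here.
                SignificantTerms terms = response.getAggregations().get("topics");
                System.out.println(terms.getBuckets().size());
            } finally {
                client.close();
            }
        }
    }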

Any ideas?

Thanks!

Felipe Hummel


I missed a part of the error message:

[WARN] 2014-09-19 20:29:13.176 o.e.t.netty - [Sigyn] Message not fully read (response) for [61] handler org.elasticsearch.action.TransportActionNodeProxy$1@2e6201d0, error [false], resetting


More information: all 5 ES nodes are on 1.3.2 (checked with curl
localhost:9200/) and run Java 1.7.0_65. The client machine is also on 1.3.2
with Java 1.7.0_65.


I'm facing the same problem. I tried the aggregations through the REST API
and everything went fine.

It looks like the registerStream method in the SignificanceHeuristicStreams
class never gets called, so the STREAMS list stays empty...
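
To sketch what I mean (a toy illustration of the registry pattern, not the actual ES code; the class and names below are made up): read() looks the heuristic up by name in a static map, and if registerStream() was never called the lookup returns null and the next call blows up, which matches the NPE at SignificanceHeuristicStreams.read(...:38).

    import java.util.HashMap;
    import java.util.Map;

    // Toy illustration only -- not the actual Elasticsearch source.
    public class RegistryIllustration {

        interface Stream {
            String getName();
        }

        private static final Map<String, Stream> STREAMS = new HashMap<String, Stream>();

        static void registerStream(Stream stream, String name) {
            STREAMS.put(name, stream);
        }

        static Stream read(String nameFromWire) {
            Stream stream = STREAMS.get(nameFromWire); // null if never registered
            System.out.println(stream.getName());      // NullPointerException here
            return stream;
        }

        public static void main(String[] args) {
            // registerStream(...) is never called, so the registry is empty ...
            read("jlh");                               // ... and this throws the NPE
        }
    }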

Cheers,
Michael


Maybe it's a bug? Does anyone from ES have an idea?


Does this problem occur with the regular Java client?


Yes, I was using the Java client, version 1.3.2.

On Monday, September 22, 2014 12:29:06 PM UTC+2, Mark Harwood wrote:

Does this problem occur with the regular Java client?


...from a Scala app?

The current line of enquiry is that something about the way the Scala
environment behaves might change how the dependency-injection system we rely
on in our Java client works.


Hi Mark, I've created isolated "mains" for Java and Scala:
https://gist.github.com/felipehummel/fbd005e6964ba546582d

Both give exactly the same errors and stack traces (also in the gist) when
run against my indices, so it can't be a Scala (using the Java client)
problem.

Both are run via SBT (with run-main), but that shouldn't matter; in the end
SBT just calls "java -cp ...."


Thanks, Felipe.

Problem reproduced and solution discovered. Please track
https://github.com/elasticsearch/elasticsearch/issues/7840 for progress.


Thanks Mark, glad I could help.

In the meantime, is there any workaround to force the module to be loaded
(without recompiling an ES .jar)?


I feel grubby even suggesting this...

  https://gist.github.com/markharwood/0cecd5019dbd5c4e90fc

but it seems to work.


Slightly less grubby....

    SignificanceHeuristicStreams.registerStream(JLHScore.STREAM, JLHScore.STREAM.getName());

Call this before creating the TransportClient, and repeat as necessary for
the other significance scoring heuristics such as MutualInformation, GND and
ChiSquare.
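
Put together, the workaround could look something like this. It's a sketch only: the ClientFactory class name is made up, it assumes the other heuristic classes are on your client's classpath and expose a STREAM constant the way JLHScore does, and the cluster name and address are placeholders.

    import org.elasticsearch.client.transport.TransportClient;
    import org.elasticsearch.common.settings.ImmutableSettings;
    import org.elasticsearch.common.transport.InetSocketTransportAddress;
    import org.elasticsearch.search.aggregations.bucket.significant.heuristics.ChiSquare;
    import org.elasticsearch.search.aggregations.bucket.significant.heuristics.GND;
    import org.elasticsearch.search.aggregations.bucket.significant.heuristics.JLHScore;
    import org.elasticsearch.search.aggregations.bucket.significant.heuristics.MutualInformation;
    import org.elasticsearch.search.aggregations.bucket.significant.heuristics.SignificanceHeuristicStreams;

    public class ClientFactory {

        public static TransportClient newClient() {
            // Register the significance heuristic streams before the client exists,
            // so significant_terms buckets can be deserialized on the client side.
            SignificanceHeuristicStreams.registerStream(JLHScore.STREAM, JLHScore.STREAM.getName());
            SignificanceHeuristicStreams.registerStream(MutualInformation.STREAM, MutualInformation.STREAM.getName());
            SignificanceHeuristicStreams.registerStream(GND.STREAM, GND.STREAM.getName());
            SignificanceHeuristicStreams.registerStream(ChiSquare.STREAM, ChiSquare.STREAM.getName());

            // Placeholder cluster name and address.
            return new TransportClient(ImmutableSettings.settingsBuilder()
                    .put("cluster.name", "my-cluster").build())
                    .addTransportAddress(new InetSocketTransportAddress("localhost", 9300));
        }
    }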
