Analyzer is closed - ERROR


(Tomasz Romanczuk) #1

After starting the node I try to refresh the index settings (i.e. change the analyzer), but something goes wrong and I get an error:
2014-03-18 12:02:40,810 WARN [org.elasticsearch.index.indexing] [alerts_node] [_percolator][0] post listener [org.elasticsearch.index.percolator.PercolatorService$RealTimePercolatorOperationListener@702f2591] failed
org.elasticsearch.ElasticSearchException: failed to parse query [316]
    at org.elasticsearch.index.percolator.PercolatorExecutor.parseQuery(PercolatorExecutor.java:361)
    at org.elasticsearch.index.percolator.PercolatorExecutor.addQuery(PercolatorExecutor.java:332)
    at org.elasticsearch.index.percolator.PercolatorService$RealTimePercolatorOperationListener.postIndexUnderLock(PercolatorService.java:295)
    at org.elasticsearch.index.indexing.ShardIndexingService.postIndexUnderLock(ShardIndexingService.java:140)
    at org.elasticsearch.index.engine.robin.RobinEngine.innerIndex(RobinEngine.java:594)
    at org.elasticsearch.index.engine.robin.RobinEngine.index(RobinEngine.java:492)
    at org.elasticsearch.index.shard.service.InternalIndexShard.performRecoveryOperation(InternalIndexShard.java:703)
    at org.elasticsearch.index.gateway.local.LocalIndexShardGateway.recover(LocalIndexShardGateway.java:224)
    at org.elasticsearch.index.gateway.IndexShardGatewayService$1.run(IndexShardGatewayService.java:174)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:619)
Caused by: org.apache.lucene.store.AlreadyClosedException: this Analyzer is closed
    at org.apache.lucene.analysis.Analyzer$ReuseStrategy.getStoredValue(Analyzer.java:368)
    at org.apache.lucene.analysis.Analyzer$GlobalReuseStrategy.getReusableComponents(Analyzer.java:410)
    at org.apache.lucene.analysis.Analyzer.tokenStream(Analyzer.java:173)
    at org.elasticsearch.index.search.MatchQuery.parse(MatchQuery.java:203)
    at org.elasticsearch.index.query.MatchQueryParser.parse(MatchQueryParser.java:163)
    at org.elasticsearch.index.query.QueryParseContext.parseInnerQuery(QueryParseContext.java:207)
    at org.elasticsearch.index.query.BoolQueryParser.parse(BoolQueryParser.java:107)
    at org.elasticsearch.index.query.QueryParseContext.parseInnerQuery(QueryParseContext.java:207)
    at org.elasticsearch.index.query.BoolQueryParser.parse(BoolQueryParser.java:107)
    at org.elasticsearch.index.query.QueryParseContext.parseInnerQuery(QueryParseContext.java:207)
    at org.elasticsearch.index.query.BoolQueryParser.parse(BoolQueryParser.java:107)
    at org.elasticsearch.index.query.QueryParseContext.parseInnerQuery(QueryParseContext.java:207)
    at org.elasticsearch.index.query.BoolQueryParser.parse(BoolQueryParser.java:93)
    at org.elasticsearch.index.query.QueryParseContext.parseInnerQuery(QueryParseContext.java:207)
    at org.elasticsearch.index.query.IndexQueryParserService.parse(IndexQueryParserService.java:284)
    at org.elasticsearch.index.query.IndexQueryParserService.parse(IndexQueryParserService.java:255)
    at org.elasticsearch.index.percolator.PercolatorExecutor.parseQuery(PercolatorExecutor.java:350)

My code:

node = NodeBuilder.nodeBuilder().settings(builder).build();
node.start();
client = node.getClient();
client.admin().indices().prepareClose(INDEX_NAME).execute().actionGet();
UpdateSettingsRequestBuilder builder = client.admin().indices().prepareUpdateSettings();
builder.setIndices(INDEX_NAME);
builder.setSettings(createSettings());
builder.execute().actionGet();
client.admin().indices().prepareOpen(INDEX_NAME).execute().actionGet();

private Builder createSettings() throws IOException {
    XContentBuilder builder = XContentFactory.jsonBuilder().startObject();
    builder.startObject("analysis");
    analyzer.appendSettings(builder);
    builder.endObject();
    builder.endObject();
    return ImmutableSettings.settingsBuilder().loadFromSource(builder.string());
}

where analyzer is a simple class which only adds a hunspell dictionary and my custom tokenizer.

The problem is that there is a thread performing index recovery, and during this process I'm closing the index. How can I avoid this situation? Is there any way to check whether recovery is in progress?
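
One possible approach (a sketch against the 0.90-era Java client API, not a verified fix; the 30-second timeout is an arbitrary choice) is to block on cluster health until the index's primary shards have recovered before issuing the close:

```java
// Wait until the index reports at least yellow health (primaries allocated,
// local gateway recovery finished) before closing it.
client.admin().cluster().prepareHealth(INDEX_NAME)
        .setWaitForYellowStatus()
        .setTimeout(TimeValue.timeValueSeconds(30))
        .execute()
        .actionGet();
client.admin().indices().prepareClose(INDEX_NAME).execute().actionGet();
```

If the health request times out, it is probably better to retry than to close anyway, since recovery replays translog operations through the percolator, as the stack trace shows.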

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email to elasticsearch+unsubscribe@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/53cef11a-e5c7-4a23-9d2e-8431691f4f73%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


(Itamar Syn-Hershko) #2

Your analyzer implementation is probably faulty. Lucene 4.6 started being stricter about the analyzer lifecycle - I suggest you first try it locally with plain Lucene code to verify that its implementation follows the lifecycle rules.

Reference:
http://lucene.apache.org/core/4_6_0/core/org/apache/lucene/analysis/TokenStream.html
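
For reference, the consumer workflow those lifecycle rules describe looks roughly like this (a sketch against the Lucene 4.6 API; the field name and sample text are placeholders):

```java
// TokenStream consumer contract: reset() before the first incrementToken(),
// end() after the last one, close() so the stream can be reused.
TokenStream ts = analyzer.tokenStream("field", "some text to analyse");
CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
try {
    ts.reset();                       // mandatory before consuming
    while (ts.incrementToken()) {
        System.out.println(term.toString());
    }
    ts.end();                         // end-of-stream bookkeeping
} finally {
    ts.close();                       // release resources for reuse
}
```

An analysis chain that violates this sequence is the kind of thing the stricter 4.6 checks are designed to catch.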

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/



(Tomasz Romanczuk) #3

It's quite a simple class:
List<String> filterNames = Lists.newArrayList();

builder.startObject(FILTER);
filterNames.add(FILTER_NAME_1);
builder.startObject(FILTER_NAME_1);
builder.field("type", "word_delimiter");
builder.array("type_table", ....);
builder.endObject();

filterNames.add(FILTER_NAME_2);
builder.startObject(FILTER_NAME_2);
builder.field("type", "hunspell");
builder.field("ignoreCase", "false");
builder.field("locale", "da_DK");
builder.endObject();

builder.endObject();

builder.startObject("analyzer");
builder.startObject(NAME);
builder.field("type", "custom");
builder.field("tokenizer", "whitespace");
builder.array(FILTER, filterNames.toArray(new String[filterNames.size()]));
builder.endObject();
builder.endObject();

What can be faulty? It analyses text properly. The problem occurs only when I restart the module and try to refresh the index settings (i.e. change the dictionary language).
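
For context, the builder code above should produce an analysis settings body roughly like the following (the filter and analyzer names here stand in for the FILTER_NAME_1, FILTER_NAME_2 and NAME constants, and the type_table entries are elided in the original, so all of those are placeholders):

```json
{
  "analysis": {
    "filter": {
      "filter_1": {
        "type": "word_delimiter",
        "type_table": ["..."]
      },
      "filter_2": {
        "type": "hunspell",
        "ignoreCase": "false",
        "locale": "da_DK"
      }
    },
    "analyzer": {
      "my_analyzer": {
        "type": "custom",
        "tokenizer": "whitespace",
        "filter": ["filter_1", "filter_2"]
      }
    }
  }
}
```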



(Itamar Syn-Hershko) #4

Did you write the analyzer that gets run on the server, or are you simply assembling an analysis chain from the client without any custom coding on the server side?




(Tomasz Romanczuk) #5

I don't have any custom code. My analyzer uses only a tokenizer (whitespace plus 3 special characters: ( ) - ) and hunspell for the Danish language. All I do is in my previous post.

W dniu wtorek, 18 marca 2014 13:29:04 UTC+1 użytkownik Itamar Syn-Hershko
napisał:

Did you write the analyzer that gets run on the server, or are you simply
assembling an analysis chain from client without any custom coding on the
server side?

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/

On Tue, Mar 18, 2014 at 2:18 PM, Tomasz Romanczuk <tomwi...@gmail.com<javascript:>

wrote:

It's quite simple class:
List filterNames = Lists.newArrayList();
builder.startObject(FILTER);
filterNames.add(FILTER_NAME_1);
builder.startObject(FILTER_NAME_1);
builder.field("type", "word_delimiter");
builder.array("type_table", ....);
builder.endObject();

    filterNames.add(FILTER_NAME_2);
    builder.startObject(FILTER_NAME_2);
    builder.field("type", "hunspell");
    builder.field("ignoreCase", "false");
    builder.field("locale", "da_DK");
    builder.endObject();

    builder.endObject();

    builder.startObject("analyzer");
    builder.startObject(NAME);
    builder.field("type", "custom");
    builder.field("tokenizer", "whitespace");
    builder.array(FILTER, filterNames.toArray(new 

String[filterNames.size()]));

    builder.endObject();
    builder.endObject();

What can be faulty? It properly analyses text. Problem occures only when
I restart module and try to refresh index setting (i.e. change dictionary
language).

W dniu wtorek, 18 marca 2014 12:51:28 UTC+1 użytkownik Itamar Syn-Hershko
napisał:

Your analyzer implementation is probably faulty. Lucene 4.6 started
being more strict about analyzers lifecycle - I suggest you try it locally
with plain Lucene code to first verify its implementation follows the life
cycle rules.

Reference: http://lucene.apache.org/core/4_6_0/core/
org/apache/lucene/analysis/TokenStream.htmlhttp://www.google.com/url?q=http%3A%2F%2Flucene.apache.org%2Fcore%2F4_6_0%2Fcore%2Forg%2Fapache%2Flucene%2Fanalysis%2FTokenStream.html&sa=D&sntz=1&usg=AFQjCNG3-c_lmcixxA0s0HmVDW4Q8bhqZA

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/

On Tue, Mar 18, 2014 at 1:30 PM, Tomasz Romanczuk tomwi...@gmail.comwrote:

After starting node I try to refresh index setting (i.e. change
analyzer), but something goes wrong, I have an error:
2014-03-18 12:02:40,810 WARN [org.elasticsearch.index.indexing]
[alerts_node] [_percolator][0] post listener [org.elasticsearch.index.
percolator.PercolatorService$RealTimePercolat
orOperationListener@702f2591] failed
org.elasticsearch.ElasticSearchException: failed to parse query [316]
at org.elasticsearch.index.percolator.PercolatorExecutor.
parseQuery(PercolatorExecutor.java:361)
at org.elasticsearch.index.percolator.PercolatorExecutor.
addQuery(PercolatorExecutor.java:332)
at org.elasticsearch.index.percolator.PercolatorService$
RealTimePercolatorOperationListener.postIndexUnderLock(
PercolatorService.java:295)
at org.elasticsearch.index.indexing.ShardIndexingService.
postIndexUnderLock(ShardIndexingService.java:140)
at org.elasticsearch.index.engine.robin.RobinEngine.
innerIndex(RobinEngine.java:594)
at org.elasticsearch.index.engine.robin.RobinEngine.
index(RobinEngine.java:492)
at org.elasticsearch.index.shard.service.InternalIndexShard.
performRecoveryOperation(InternalIndexShard.java:703)
at org.elasticsearch.index.gateway.local.
LocalIndexShardGateway.recover(LocalIndexShardGateway.java:224)
at org.elasticsearch.index.gateway.IndexShardGatewayService$1.
run(IndexShardGatewayService.java:174)
at java.util.concurrent.ThreadPoolExecutor$Worker.
runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(
ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)
Caused by: org.apache.lucene.store.AlreadyClosedException: this
Analyzer is closed
at org.apache.lucene.analysis.Analyzer$ReuseStrategy.
getStoredValue(Analyzer.java:368)
at org.apache.lucene.analysis.Analyzer$GlobalReuseStrategy.
getReusableComponents(Analyzer.java:410)
at org.apache.lucene.analysis.Analyzer.tokenStream(Analyzer.
java:173)
at org.elasticsearch.index.search.MatchQuery.parse(
MatchQuery.java:203)
at org.elasticsearch.index.query.MatchQueryParser.parse(
MatchQueryParser.java:163)
at org.elasticsearch.index.query.QueryParseContext.
parseInnerQuery(QueryParseContext.java:207)
at org.elasticsearch.index.query.BoolQueryParser.parse(
BoolQueryParser.java:107)
at org.elasticsearch.index.query.QueryParseContext.
parseInnerQuery(QueryParseContext.java:207)
at org.elasticsearch.index.query.BoolQueryParser.parse(
BoolQueryParser.java:107)
at org.elasticsearch.index.query.QueryParseContext.
parseInnerQuery(QueryParseContext.java:207)
at org.elasticsearch.index.query.BoolQueryParser.parse(
BoolQueryParser.java:107)
at org.elasticsearch.index.query.QueryParseContext.
parseInnerQuery(QueryParseContext.java:207)
at org.elasticsearch.index.query.BoolQueryParser.parse(
BoolQueryParser.java:93)
at org.elasticsearch.index.query.QueryParseContext.
parseInnerQuery(QueryParseContext.java:207)
at org.elasticsearch.index.query.IndexQueryParserService.parse(
IndexQueryParserService.java:284)
at org.elasticsearch.index.query.IndexQueryParserService.parse(
IndexQueryParserService.java:255)
at org.elasticsearch.index.percolator.PercolatorExecutor.
parseQuery(PercolatorExecutor.java:350)

My code:
node = NodeBuilder.nodeBuilder().settings(builder).build();
node.start();
client = node.getClient();
client.admin().indices().prepareClose(INDEX_NAME).
execute().actionGet();
UpdateSettingsRequestBuilder builder = client.admin().indices().
prepareUpdateSettings();
builder.setIndices(INDEX_NAME);
builder.setSettings(createSettings());
builder.execute().actionGet();
client.admin().indices().prepareOpen(INDEX_NAME).
execute().actionGet();

private Builder createSettings() throws IOException {
XContentBuilder builder = XContentFactory.jsonBuilder().
startObject();
builder.startObject("analysis");
analyzer.appendSettings(builder);
builder.endObject();
builder.endObject();
return ImmutableSettings.settingsBuilder().
loadFromSource(builder.string());
}

where analyzer is a simple class which only adds a hunspell dictionary
and my custom tokenizer.

The problem is that there is a thread performing index recovery, and I'm
closing the index while that recovery is still running. How can I avoid
this situation? Is there any way to check whether recovery is in progress?
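One way to sidestep the race is to wait for recovery to finish before closing the index. A minimal sketch against the 0.90-era Java client used above (the helper name and the 60s timeout are my own, and this is untested against a live cluster):

```java
import org.elasticsearch.action.admin.cluster.health.ClusterHealthResponse;
import org.elasticsearch.client.Client;

public class WaitForRecovery {
    // Hypothetical helper: block until the index reaches at least yellow
    // status (all primary shards allocated and recovered). Only after that
    // is it reasonably safe to close the index and update its settings.
    static void waitUntilRecovered(Client client, String indexName) {
        ClusterHealthResponse health = client.admin().cluster()
                .prepareHealth(indexName)
                .setWaitForYellowStatus()   // primaries finished recovery
                .setTimeout("60s")
                .execute().actionGet();
        if (health.isTimedOut()) {
            throw new IllegalStateException(
                    "Index " + indexName + " did not finish recovery in time");
        }
    }
}
```

Called as `waitUntilRecovered(client, INDEX_NAME);` right before the `prepareClose(...)` call, this should keep the close from racing the gateway recovery thread.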

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email to elasticsearc...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/53cef11a-e5c7-4a23-9d2e-8431691f4f73%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.



(Itamar Syn-Hershko) #6

This could be a bug in the percolator then - I'd open an issue on github
with a minimal repro

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/

On Tue, Mar 18, 2014 at 2:41 PM, Tomasz Romanczuk <tomwid1983@gmail.com> wrote:

I don't have any custom code. My analyzer uses only a tokenizer (whitespace
plus 3 special characters: ( ) -) and hunspell for the Danish language.
Everything I do is in my previous post.

On Tuesday, March 18, 2014 at 1:29:04 PM UTC+1, Itamar Syn-Hershko wrote:

Did you write the analyzer that gets run on the server, or are you simply
assembling an analysis chain from the client without any custom coding on
the server side?

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/

On Tue, Mar 18, 2014 at 2:18 PM, Tomasz Romanczuk <tomwi...@gmail.com> wrote:

It's quite a simple class:

List<String> filterNames = Lists.newArrayList();
builder.startObject(FILTER);

filterNames.add(FILTER_NAME_1);
builder.startObject(FILTER_NAME_1);
builder.field("type", "word_delimiter");
builder.array("type_table", ....);
builder.endObject();

filterNames.add(FILTER_NAME_2);
builder.startObject(FILTER_NAME_2);
builder.field("type", "hunspell");
builder.field("ignoreCase", "false");
builder.field("locale", "da_DK");
builder.endObject();

builder.endObject();

builder.startObject("analyzer");
builder.startObject(NAME);
builder.field("type", "custom");
builder.field("tokenizer", "whitespace");
builder.array(FILTER, filterNames.toArray(new String[filterNames.size()]));
builder.endObject();
builder.endObject();

What can be faulty? It analyses text properly. The problem occurs only when
I restart the module and try to refresh the index settings (i.e. change
the dictionary language).

On Tuesday, March 18, 2014 at 12:51:28 PM UTC+1, Itamar Syn-Hershko wrote:

Your analyzer implementation is probably faulty. Lucene 4.6 started being
more strict about the analyzer lifecycle - I suggest you first try it
locally with plain Lucene code to verify that its implementation follows
the lifecycle rules.

Reference: http://lucene.apache.org/core/4_6_0/core/org/apache/lucene/analysis/TokenStream.html
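For reference, the contract the linked javadoc describes can be exercised with a small plain-Lucene 4.6 sketch. This is an illustration only (the class name `MyAnalyzer` stands in for whatever custom analyzer you have); if the second `consume(...)` call throws AlreadyClosedException, the analyzer breaks the reuse rules:

```java
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import java.io.IOException;
import java.io.StringReader;

public class AnalyzerLifecycleCheck {
    // Consume one TokenStream following the 4.x workflow:
    // addAttribute -> reset -> incrementToken* -> end -> close.
    static void consume(Analyzer analyzer, String text) throws IOException {
        TokenStream ts = analyzer.tokenStream("field", new StringReader(text));
        try {
            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
            ts.reset();                 // mandatory before incrementToken()
            while (ts.incrementToken()) {
                System.out.println(term.toString());
            }
            ts.end();                   // mandatory after the last token
        } finally {
            ts.close();                 // marks the stream reusable
        }
    }

    public static void main(String[] args) throws IOException {
        Analyzer analyzer = new MyAnalyzer(); // hypothetical custom analyzer
        consume(analyzer, "first run");
        consume(analyzer, "second run");      // must work on a reused instance
        analyzer.close();                     // close only when done for good
    }
}
```

Note in particular that `Analyzer.close()` is terminal: once called, every later `tokenStream(...)` on that instance fails with "this Analyzer is closed", which matches the stack trace above.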

--

Itamar Syn-Hershko
http://code972.com | @synhershko https://twitter.com/synhershko
Freelance Developer & Consultant
Author of RavenDB in Action http://manning.com/synhershko/



(system) #7