Jorg,
That is exactly the kind of thing I'm looking for.
I'm having a little difficulty getting it to do what I want: I want to "push" an index to another index and change the mapping. I can import/export okay, but the push is not picking up the new mappings.
The syntax for push seems to be to specify the name of the mapping file, which in my case is /tmp/testpu_doc_mapping.json and contains:
{
  "doc": {
    "_timestamp": {
      "enabled": true,
      "store": true,
      "path": "date"
    },
    "properties": {
      "date": {
        "type": "date",
        "format": "dateOptionalTime"
      },
      "sentence": {
        "type": "string"
      }
    }
  }
}
Note that I want sentence to be not_analyzed.
Maybe the syntax of the above file is not correct? I tried other variations.
And when the server says "adding mapping: default", that's probably not a good sign?
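For comparison, in Elasticsearch 1.x a string field is made not_analyzed by setting "index": "not_analyzed" on the field itself. A sketch of what the target mapping could look like with that setting added (only the "index" line differs from the file above):

```json
{
  "doc": {
    "_timestamp": {
      "enabled": true,
      "store": true,
      "path": "date"
    },
    "properties": {
      "date": {
        "type": "date",
        "format": "dateOptionalTime"
      },
      "sentence": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  }
}
```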
I then issue this command:
curl -XPOST
'localhost:9200/test/_push?map={"test":"testpu"}&{"test_doc_mapping":"/tmp/testpu_doc_mapping.json"}'
But this is clearly wrong
The server shows:
[2014-10-19 01:10:34,216][INFO ][BaseTransportClient ] creating transport client, java version 1.7.0_40, effective settings {host=localhost, port=9300, cluster.name=elasticsearch, timeout=30s, client.transport.sniff=true, client.transport.ping_timeout=30s, client.transport.ignore_cluster_name=true, path.plugins=.dontexist}
[2014-10-19 01:10:34,218][INFO ][plugins ] [Left Hand] loaded [], sites []
[2014-10-19 01:10:34,238][INFO ][BaseTransportClient ] transport client settings = {host=localhost, port=9300, cluster.name=elasticsearch, timeout=30s, client.transport.sniff=true, client.transport.ping_timeout=30s, client.transport.ignore_cluster_name=true, path.plugins=.dontexist, path.home=C:\elasticsearch-1.3.4, name=Left Hand, path.logs=C:/elasticsearch-1.3.4/logs, network.server=false, node.client=true}
[2014-10-19 01:10:34,239][INFO ][BaseTransportClient ] adding custom address for transport client: inet[localhost/127.0.0.1:9300]
[2014-10-19 01:10:34,246][INFO ][BaseTransportClient ] configured addresses to connect = [inet[localhost/127.0.0.1:9300]], waiting for 30 seconds to connect ...
[2014-10-19 01:11:04,247][INFO ][BaseTransportClient ] connected nodes = [[Logan][-4NzM7wxQ6S8IEK-aOST1Q][zippity][inet[/192.168.43.250:9300]], [#transport#-1][zippity][inet[localhost/127.0.0.1:9300]]]
[2014-10-19 01:11:04,247][INFO ][BaseTransportClient ] new connection to [Logan][-4NzM7wxQ6S8IEK-aOST1Q][zippity][inet[/192.168.43.250:9300]]
[2014-10-19 01:11:04,248][INFO ][BaseTransportClient ] new connection to [#transport#-1][zippity][inet[localhost/127.0.0.1:9300]]
[2014-10-19 01:11:04,248][INFO ][BaseTransportClient ] trying to discover more nodes...
[2014-10-19 01:11:04,254][INFO ][BaseTransportClient ] adding discovered node [Logan][-4NzM7wxQ6S8IEK-aOST1Q][zippity][inet[/192.168.43.250:9300]]
[2014-10-19 01:11:04,258][INFO ][BaseTransportClient ] ... discovery done
[2014-10-19 01:11:04,259][INFO ][KnapsackService ] add: plugin.knapsack.export.state ->
[2014-10-19 01:11:04,259][INFO ][KnapsackPushAction ] start of push: {"mode":"push","started":"2014-10-19T00:11:04.259Z","node_name":"Logan"}
[2014-10-19 01:11:04,259][INFO ][KnapsackService ] update cluster settings: plugin.knapsack.export.state -> [{"mode":"push","started":"2014-10-19T00:11:04.259Z","node_name":"Logan"}]
[2014-10-19 01:11:04,259][INFO ][KnapsackPushAction ] map={test=testpu}
[2014-10-19 01:11:04,260][INFO ][KnapsackPushAction ] getting settings for indices [test]
[2014-10-19 01:11:04,261][INFO ][KnapsackPushAction ] found indices: [test]
[2014-10-19 01:11:04,261][INFO ][KnapsackPushAction ] getting mappings for index test and types
[2014-10-19 01:11:04,262][INFO ][KnapsackPushAction ] found mappings: [default, doc]
[2014-10-19 01:11:04,263][INFO ][KnapsackPushAction ] adding mapping: default
[2014-10-19 01:11:04,263][INFO ][KnapsackPushAction ] adding mapping: doc
[2014-10-19 01:11:04,263][INFO ][KnapsackPushAction ] creating index: testpu
[2014-10-19 01:11:04,296][INFO ][cluster.metadata ] [Logan] [testpu] creating index, cause [api], shards [5]/[1], mappings [default, doc]
[2014-10-19 01:11:04,374][INFO ][KnapsackPushAction ] index created: testpu
[2014-10-19 01:11:04,374][INFO ][KnapsackPushAction ] getting aliases for index test
[2014-10-19 01:11:04,374][INFO ][KnapsackPushAction ] found 0 aliases
[2014-10-19 01:11:04,375][INFO ][BulkTransportClient ] flushing bulk processor
[2014-10-19 01:11:04,376][INFO ][BulkTransportClient ] before bulk [1] [actions=3] [bytes=404] [concurrent requests=0]
[2014-10-19 01:11:04,418][INFO ][BulkTransportClient ] after bulk [1] [succeeded=3] [failed=0] [41ms] [concurrent requests=0]
[2014-10-19 01:11:04,419][INFO ][BulkTransportClient ] closing bulk processor...
[2014-10-19 01:11:04,419][INFO ][BulkTransportClient ] shutting down...
[2014-10-19 01:11:04,427][INFO ][BulkTransportClient ] shutting down completed
[2014-10-19 01:11:04,427][INFO ][KnapsackPushAction ] end of push: {"mode":"push","started":"2014-10-19T00:11:04.259Z","node_name":"Logan"}, count = 3
[2014-10-19 01:11:04,428][INFO ][KnapsackService ] remove: plugin.knapsack.export.state -> [{"mode":"push","started":"2014-10-19T00:11:04.259Z","node_name":"Logan"}]
[2014-10-19 01:11:04,428][INFO ][KnapsackService ] update cluster settings: plugin.knapsack.export.state ->
But it seems that no matter what value I put for test_doc_mapping, it doesn't find the mapping file. It creates testpu with the same mappings as test and copies the test data into the testpu index.
If I try this command:
curl -XPOST
'localhost:9200/test/_push?map={"test":"testpu"}&test_doc_mapping=/tmp/testpu_doc_mapping.json'
curl doesn't like it:
{"error":"JsonParseException[Unexpected character ('"' (code 34)): was expecting either '*' or '/' for a comment\n at [Source: /"test":"testpu"/; line: 1,
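One thing worth ruling out with that second form: the braces and quotes in the map value go into the query string unencoded, and the shell/HTTP layer treats everything after the & as a separate parameter. Percent-encoding the JSON before putting it in the URL sidesteps both problems. A generic sketch of building such a URL (the _push endpoint is knapsack's; the encoding step itself is standard and not specific to the plugin):

```python
from urllib.parse import quote

# The JSON value intended for the map query parameter.
mapping = '{"test":"testpu"}'

# Percent-encode every reserved character so the value travels as one opaque token.
encoded = quote(mapping, safe="")
url = "localhost:9200/test/_push?map=" + encoded
print(url)  # localhost:9200/test/_push?map=%7B%22test%22%3A%22testpu%22%7D
```

curl's `--data-urlencode` option can do the same encoding without hand-building the string.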
I finally decided to do this:
curl -XPOST 'localhost:9200/test/_push?map={"test":"testpu"}' -d
'test_doc_mapping=/tmp/testpu_doc_mapping.json'
which might have been better, but the server said:
[2014-10-19 01:52:08,241][INFO ][BaseTransportClient ] connected nodes = [[Logan][-4NzM7wxQ6S8IEK-aOST1Q][Paul][inet[/192.168.43.250:9300]], [#transport#-1][Paul][inet[localhost/127.0.0.1:9300]]]
[2014-10-19 01:52:08,241][INFO ][BaseTransportClient ] new connection to [Logan][-4NzM7wxQ6S8IEK-aOST1Q][Paul][inet[/192.168.43.250:9300]]
[2014-10-19 01:52:08,242][INFO ][BaseTransportClient ] new connection to [#transport#-1][Paul][inet[localhost/127.0.0.1:9300]]
[2014-10-19 01:52:08,242][INFO ][BaseTransportClient ] trying to discover more nodes...
[2014-10-19 01:52:08,247][INFO ][BaseTransportClient ] adding discovered node [Logan][-4NzM7wxQ6S8IEK-aOST1Q][Paul][inet[/192.168.43.250:9300]]
[2014-10-19 01:52:08,251][INFO ][BaseTransportClient ] ... discovery done
[2014-10-19 01:52:08,252][INFO ][KnapsackService ] add: plugin.knapsack.export.state ->
[2014-10-19 01:52:08,252][INFO ][KnapsackPushAction ] start of push: {"mode":"push","started":"2014-10-19T00:52:08.252Z","node_name":"Logan"}
[2014-10-19 01:52:08,253][INFO ][KnapsackService ] update cluster settings: plugin.knapsack.export.state -> [{"mode":"push","started":"2014-10-19T00:52:08.252Z","node_name":"Logan"}]
[2014-10-19 01:52:08,253][INFO ][KnapsackPushAction ] map={test=testpu}
[2014-10-19 01:52:08,254][INFO ][KnapsackPushAction ] getting settings for indices [test]
[2014-10-19 01:52:08,255][INFO ][KnapsackPushAction ] found indices: [test]
[2014-10-19 01:52:08,255][INFO ][KnapsackPushAction ] getting mappings for index test and types
[2014-10-19 01:52:08,256][INFO ][KnapsackPushAction ] found mappings: [default, doc]
[2014-10-19 01:52:08,256][INFO ][KnapsackPushAction ] adding mapping: default
[2014-10-19 01:52:08,257][INFO ][KnapsackPushAction ] adding mapping: doc
[2014-10-19 01:52:08,257][INFO ][KnapsackPushAction ] creating index: testpu
[2014-10-19 01:52:08,286][INFO ][cluster.metadata ] [Logan] [testpu] creating index, cause [api], shards [5]/[1], mappings [default, doc]
[2014-10-19 01:52:08,362][INFO ][KnapsackPushAction ] index created: testpu
[2014-10-19 01:52:08,362][INFO ][KnapsackPushAction ] getting aliases for index test
[2014-10-19 01:52:08,362][INFO ][KnapsackPushAction ] found 0 aliases
[2014-10-19 01:52:08,363][DEBUG][action.search.type ] [Logan] [test][2], node[-4NzM7wxQ6S8IEK-aOST1Q], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@2a6319a4] lastShard [true]
org.elasticsearch.search.SearchParseException: [test][2]: from[-1],size[-1]: Parse Failure [Failed to parse source [_na]]
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:660)
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:516)
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:488)
    at org.elasticsearch.search.SearchService.executeScan(SearchService.java:207)
    at org.elasticsearch.search.action.SearchServiceTransportAction$19.call(SearchServiceTransportAction.java:444)
    at org.elasticsearch.search.action.SearchServiceTransportAction$19.call(SearchServiceTransportAction.java:441)
    at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:517)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)
Caused by: org.elasticsearch.ElasticsearchParseException: Failed to derive xcontent from org.elasticsearch.common.bytes.ChannelBufferBytesReference@83ae6e1b
    at org.elasticsearch.common.xcontent.XContentFactory.xContent(XContentFactory.java:259)
    at org.elasticsearch.search.SearchService.parseSource(SearchService.java:630)
    ... 9 more
[... identical SearchParseException stack traces repeated for shards [test][4], [test][3], [test][0] and [test][1] ...]
[2014-10-19 01:52:08,376][DEBUG][action.search.type ] [Logan] All shards failed for phase: [init_scan]
[2014-10-19 01:52:08,385][ERROR][KnapsackPushAction ] Failed to execute phase [init_scan], all shards failed; shardFailures {[-4NzM7wxQ6S8IEK-aOST1Q][test][0]: SearchParseException[[test][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [_na]]]; nested: ElasticsearchParseException[Failed to derive xcontent from org.elasticsearch.common.bytes.ChannelBufferBytesReference@83ae6e1b]; } [... same failure repeated for shards [test][1] through [test][4] ...]
org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [init_scan], all shards failed; shardFailures [... as above ...]
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:233)
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$1.onFailure(TransportSearchTypeAction.java:179)
    at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:523)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:724)
[2014-10-19 01:52:08,396][INFO ][KnapsackService ] remove: plugin.knapsack.export.state -> [{"mode":"push","started":"2014-10-19T00:52:08.252Z","node_name":"Logan"}]
[2014-10-19 01:52:08,399][INFO ][KnapsackService ] update cluster settings: plugin.knapsack.export.state ->
Any pointers, please?
Secondly, I had difficulty getting the above command to work in Sense. I would find it easier if I could issue these commands from Sense.
Thirdly, while this is what I want: is there a more full-featured, operations-ready, GUI-based tool with the same functionality?
I appreciate your help.
Regards.
On Friday, October 17, 2014 4:10:11 PM UTC+1, Jörg Prante wrote:
You can use the knapsack plugin to export/import data and change mappings (and much more!)
For a 1:1 online copy, just one curl command is necessary, yes.
GitHub - jprante/elasticsearch-knapsack: Knapsack plugin is an import/export tool for Elasticsearch
Jörg
On Thu, Oct 16, 2014 at 7:55 PM, <eune...@gmail.com> wrote:
Hi
I can see there are lots of utilities to copy the contents of an index, such as:
elasticdump
reindexer
stream2es
etc.
And they mostly use scan/scroll.
Is there a single curl command to copy an index to a new index?
Without too much investigation, it looks like scan/scroll requires repeated calls?
Can you please confirm?
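For what it's worth, yes: scan/scroll is inherently iterative. Each scroll request returns one batch of hits plus a cursor, and the client keeps calling until an empty batch comes back, so any single-command tool is just wrapping that loop for you. A tiny simulation of the loop shape (the fetch function stands in for the HTTP scroll call; the names are illustrative, not an Elasticsearch API):

```python
def drain_scroll(fetch_page):
    """Keep requesting pages until the cursor returns an empty batch,
    which is how a scan/scroll copy knows it has seen every document."""
    while True:
        hits = fetch_page()
        if not hits:
            return
        yield from hits

# Stand-in for an index of 5 docs that the scroll cursor serves in pages of 2.
docs = [{"_id": i} for i in range(5)]
pages = iter([docs[0:2], docs[2:4], docs[4:5], []])

copied = list(drain_scroll(lambda: next(pages)))
print(len(copied))  # 5 documents copied across 4 scroll calls
```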
If this is the case what is the simplest supported utility?
Alternatively is there a plugin with front end to choose from and to
index?
Thanks in advance
--
You received this message because you are subscribed to the Google Groups
"elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an
email to elasticsearc...@googlegroups.com.
To view this discussion on the web visit
https://groups.google.com/d/msgid/elasticsearch/1caeebf5-44de-4eba-ad5a-c702461bf3d2%40googlegroups.com
.
For more options, visit https://groups.google.com/d/optout.