Elasticsearch Hadoop - EsHadoopSerializationException

Hi guys,

I am trying to run an MR job that reads from HDFS and stores into an
Elasticsearch cluster.

I am getting the following error:

Error: org.elasticsearch.hadoop.serialization.EsHadoopSerializationException: Cannot handle type [class org.apache.hadoop.io.MapWritable], instance [org.apache.hadoop.io.MapWritable@3879429f] using writer [org.elasticsearch.hadoop.mr.WritableValueWriter@3fc8f1a2]
        at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:259)
        at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:68)
        at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:55)
        at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:130)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.write(EsOutputFormat.java:159)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:635)
        at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
        at com.teradata.cybershot.mr.es.userprofile.EsOnlineProfileMapper.map(EsOnlineProfileMapper.java:35)
        at com.teradata.cybershot.mr.es.userprofile.EsOnlineProfileMapper.map(EsOnlineProfileMapper.java:20)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

We are using CDH 5.1.0 and the es-hadoop dependency version 2.0.2.

I have this set in my job configuration:
job.setOutputFormatClass(EsOutputFormat.class);
job.setMapOutputValueClass(MapWritable.class);

together with the nodes and resource properties, as described on the ES page.

In my mapper I simply write: context.write(NullWritable.get(), esMap);
where esMap is an org.apache.hadoop.io.MapWritable.
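
To make the setup more concrete, here is roughly what the driver and mapper look like (a simplified sketch, not the exact code; the class names, the field names and the es.nodes/es.resource values are placeholders):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.elasticsearch.hadoop.mr.EsOutputFormat;

public class EsIndexingJob {

    // Mapper that turns each input line into a MapWritable document for ES.
    public static class ProfileMapper extends Mapper<LongWritable, Text, NullWritable, MapWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            MapWritable esMap = new MapWritable();
            esMap.put(new Text("line"), new Text(value));                     // placeholder fields
            esMap.put(new Text("length"), new LongWritable(value.getLength()));
            context.write(NullWritable.get(), esMap);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("es.nodes", "localhost:9200");              // placeholder ES node
        conf.set("es.resource", "myindex/mytype");           // placeholder index/type
        conf.setBoolean("mapreduce.map.speculative", false); // disable speculative execution for ES writes
        conf.setBoolean("mapreduce.reduce.speculative", false);

        Job job = Job.getInstance(conf, "hdfs-to-es");
        job.setJarByClass(EsIndexingJob.class);
        job.setInputFormatClass(TextInputFormat.class);
        job.setMapperClass(ProfileMapper.class);
        job.setOutputFormatClass(EsOutputFormat.class);
        job.setMapOutputValueClass(MapWritable.class);
        job.setNumReduceTasks(0);                            // map-only job, writes straight to ES
        FileInputFormat.addInputPath(job, new Path(args[0]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}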

I do not know why it's failing, as everything looks OK to me. Maybe you will
have some ideas.

Thanks in advance,
Kamil.


Hi,

This error is typically tied to a classpath issue - make sure you have only one elasticsearch-hadoop jar version in your
classpath and on the Hadoop cluster.


--
Costin


Hi,

I had only one jar on the classpath and none on the Hadoop cluster.
I did have different types of values in my MapWritable though, and it turns
out this was the problem.
The key was always Text, but depending on the field the value in that map was
Text, LongWritable, BooleanWritable or DoubleWritable.
When I changed everything to be Text it started working.
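
To illustrate the difference (a simplified sketch inside the mapper's map() method; the field names and values are made up):

// Failing variant: mixed Writable value types in the MapWritable.
MapWritable esMap = new MapWritable();
esMap.put(new Text("user"), new Text("kamil"));
esMap.put(new Text("visits"), new LongWritable(42L));
esMap.put(new Text("active"), new BooleanWritable(true));
esMap.put(new Text("score"), new DoubleWritable(0.97d));
context.write(NullWritable.get(), esMap);            // threw EsHadoopSerializationException for us

// Working variant: everything converted to Text before writing.
MapWritable textOnlyMap = new MapWritable();
textOnlyMap.put(new Text("user"), new Text("kamil"));
textOnlyMap.put(new Text("visits"), new Text("42"));
textOnlyMap.put(new Text("active"), new Text("true"));
textOnlyMap.put(new Text("score"), new Text("0.97"));
context.write(NullWritable.get(), textOnlyMap);      // indexed fine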

Is this intended behaviour?

Cheers,
Kamil.


Having multiple types shouldn't be an issue - ES is a document store, so it's pretty common to have different types.
In other words, this is not the intended behavior. Can you please create a small sample/snippet that reproduces the error
and raise an issue for it [1]?

Thanks!

[1] the elasticsearch-hadoop Troubleshooting page (…/master/troubleshooting.html)


--
Costin


Hi Costin,

I didn't find a Jira or anything for raising issues on that help page.
Anyway, I created two small Java classes: one for the MR job and one IT test
that runs the job and triggers the exception (to avoid doing it from the
command line), plus a dummy input file just to get inside the mapper.

I attached them. If you need some more info let me know.

The dummy input file was placed in src/test/resources/input/input.txt so the
test can read it.

I tested this in a Gradle project (within my existing one), with the
elasticsearch-hadoop 2.0.2 dependency and Java 7.

You can see the exception being thrown in the console when running it from
the IDE as a JUnit test.
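
For reference, the attached IT test is essentially along these lines (a simplified sketch, not the actual attached class; the ES node address, index name and class names are placeholders, and it assumes an Elasticsearch instance is reachable at the configured address):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.elasticsearch.hadoop.mr.EsOutputFormat;
import org.junit.Test;

public class EsIndexingJobIT {

    @Test
    public void reproducesSerializationException() throws Exception {
        Configuration conf = new Configuration();
        conf.set("mapreduce.framework.name", "local");   // run the job in the local JVM
        conf.set("fs.defaultFS", "file:///");            // read the dummy input from the local FS
        conf.set("es.nodes", "localhost:9200");          // placeholder ES node
        conf.set("es.resource", "test/doc");             // placeholder index/type

        Job job = Job.getInstance(conf, "es-repro");
        job.setMapperClass(EsIndexingJob.ProfileMapper.class);   // the mapper from the job class
        job.setOutputFormatClass(EsOutputFormat.class);
        job.setMapOutputValueClass(MapWritable.class);
        job.setNumReduceTasks(0);
        FileInputFormat.addInputPath(job, new Path("src/test/resources/input/input.txt"));

        // In the failing setup this surfaces the EsHadoopSerializationException in the console.
        job.waitForCompletion(true);
    }
}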

Cheers,
Kamil.
