Elasticsearch with JSON array causing serialize error

I'm indexing a MongoDB collection into Elasticsearch using elmongo (https://github.com/usesold/elmongo).
I have a collection that contains a field which, from the Elasticsearch index's point of view, is a JSON array, for example:
"random_point": [ 0.10007477086037397, 0 ]

That's most likely the reason I get this error when trying to index my collection:
[2014-04-20 16:48:51,228][DEBUG][action.bulk ] [Emma Frost] [mediacontent-2014-04-20t16:48:44.116z][4] failed to execute bulk item (index) index {[mediacontent-2014-04$

org.elasticsearch.index.mapper.MapperParsingException: object mapping [random_point] trying to serialize a value with no field associated with it, current value [0.1000747708603739$
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:595)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:467)
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:599)
    at org.elasticsearch.index.mapper.object.ObjectMapper.serializeArray(ObjectMapper.java:587)
    at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:459)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:506)
    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:450)
    at org.elasticsearch.index.shard.service.InternalIndexShard.prepareIndex(InternalIndexShard.java:327)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:381)
    at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:155)
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction$
    at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:430)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:701)

[2014-04-20 16:48:54,129][INFO ][cluster.metadata ] [Emma Frost] [mediacontent-2014-04-20t16:39:09.348z] deleting index

Is there any way to bypass this? That array is a required value in my
collection. Is there an option in Elasticsearch to skip indexing that
JSON field, since it doesn't need to be searchable at all?
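
I was hoping something like the "enabled" flag on an object mapping could be used here. This is only a sketch of what I have in mind (the index and type names are placeholders, since elmongo generates its own timestamped indices), and I haven't confirmed that it avoids the error:

    # placeholder index/type names; "enabled": false disables parsing/indexing of random_point
    curl -XPUT 'http://localhost:9200/mediacontent' -d '{
      "mappings": {
        "mediacontent": {
          "properties": {
            "random_point": {
              "type": "object",
              "enabled": false
            }
          }
        }
      }
    }'

As I understand it, "enabled": false should keep the value in _source without parsing or indexing it, but I don't know how it interacts with the index that elmongo creates.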

Best regards,

PK


Does anyone have any suggestions on how to prevent this?


Same issue here. Any clues?


Hi,

It looks like "random_point" is mapped as an object type but is receiving an array of numbers. There may be an inconsistency in the data, or the mapping wasn't defined correctly.
You may want to apply the correct mapping, e.g. "float" or "double".
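
For example, something along these lines, creating the index with an explicit mapping before any documents are indexed (the index and type names below are just placeholders, adjust them to whatever elmongo creates):

    # placeholder index/type names; maps random_point explicitly as double
    curl -XPUT 'http://localhost:9200/mediacontent' -d '{
      "mappings": {
        "mediacontent": {
          "properties": {
            "random_point": { "type": "double" }
          }
        }
      }
    }'

An array of numbers is then simply indexed as multiple values of that double field, so there is no object mapping to conflict with.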

Masaru
