Hello,
I just started with ES, using 1.2.1 and the Hadoop interface to push log data from Hadoop (HDP 2.1) into ES via a Hive external table with defined fields. That worked great.
Our event logs are JSON, and next I want to dynamically index the entire
document instead of just designated fields. However, some fields come
through as, say, a string in some records, while in other records that same
field is a complex nested type of some sort. This upsets ES, because it
expects a field, once mapped, to keep the same type.
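For example (these records are made up, just to show the shape of the
problem), one event might look like

    { "user": "ken", "action": "login" }

while a later event carries the same field as an object:

    { "user": { "name": "ken", "id": 42 }, "action": "login" }

Once "user" has been mapped as a string, the second document gets rejected.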
Any suggestions as to the best way to deal with this? I'm looking at ways to
flatten the incoming structures (rough sketch below), but maybe ES has
something built in that's more elegant.
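Here is roughly the kind of flattening I mean, as a minimal Python sketch
with made-up field names: collapse nested objects into dotted keys so every
value ends up scalar before it is indexed.

    import json

    def flatten(obj, prefix=""):
        # Collapse nested dicts and lists into a single level of dotted keys.
        flat = {}
        if isinstance(obj, dict):
            for key, value in obj.items():
                flat.update(flatten(value, prefix + str(key) + "."))
        elif isinstance(obj, list):
            for i, value in enumerate(obj):
                flat.update(flatten(value, prefix + str(i) + "."))
        else:
            flat[prefix.rstrip(".")] = obj
        return flat

    # Hypothetical events where "user" changes shape between records.
    for event in [{"user": "ken"}, {"user": {"name": "ken", "id": 42}}]:
        print(json.dumps(flatten(event)))
    # {"user": "ken"}
    # {"user.name": "ken", "user.id": 42}

Even then, a field that is sometimes a plain string and sometimes an object
shows up as "user" in one record and "user.name" in another, so I would still
need a naming convention for that case, which is why I'm hoping ES has
something more elegant built in.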
Thanks
Ken