I have an Elasticsearch cluster on Linux and I index documents using the NEST framework. The problem is that one object contains a dynamic field. For example: there is a class named Product, and one of its properties is dynamic, which means its mapping changes every time according to the new request. Let's say the name of the dynamic property is Data. Data might be an object, a string, an integer, a list of objects, etc.; it might be anything. I don't know if this is normal, but so far there has been no other way to do it.
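To make it concrete, here is a simplified sketch of the class (Id and Name are just placeholders; only Data is the real issue):

```csharp
// Simplified sketch of the document class; Id and Name are placeholders.
public class Product
{
    public string Id { get; set; }
    public string Name { get; set; }

    // Data changes shape with every request: it might be an object, a string,
    // an integer, a list of objects, etc., so it is declared as object.
    public object Data { get; set; }
}
```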
About 1 GB of data is stored into the index every day, but when I search in Kibana for a document that was indexed one or two weeks ago, it cannot find it; the query fails with a "Timeout ..." error. So far there are about 1800 fields in my index.
My question is: what is the best practice for dynamic field mapping? What should I do?
I am trying to use the index as log storage. I have about 20 applications, and these apps create logs (requests and responses). I consume the log data as JSON in my Elasticsearch consumer application, which means I have a JSON string in my Elasticsearch client application. I convert every JSON payload to a JObject and then insert it into the Elasticsearch index.
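Roughly, the indexing side looks like the sketch below (NEST 7.x-style; the URL, the index name, and the use of the low-level client to send the raw JSON are illustrative, not my exact code):

```csharp
using System;
using Elasticsearch.Net;
using Nest;
using Newtonsoft.Json.Linq;

public static class LogIndexer
{
    // One client for the whole consumer application.
    private static readonly ElasticClient client = new ElasticClient(
        new ConnectionSettings(new Uri("http://localhost:9200"))
            .DefaultIndex("app-logs"));

    // 'json' is one log message consumed from an application, as a raw JSON string.
    public static void IndexLog(string json)
    {
        // Every app sends a different shape, so the JObject contains whatever
        // fields that particular app produced.
        var log = JObject.Parse(json);

        // Send the raw JSON through the low-level client; Elasticsearch maps
        // any new fields dynamically the first time it sees them.
        var response = client.LowLevel.Index<StringResponse>(
            "app-logs", PostData.String(log.ToString()));

        if (!response.Success)
            Console.WriteLine(response.DebugInformation);
    }
}
```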
That's why there are about 1800 fields; every project has its own type. So what is your recommendation, or what is the best practice for doing this?
Wouldn't splitting them into different indices cause a search performance problem? I mean, Kibana searches one index now, but if I split it up, it will start searching across all the indices. Won't this cause performance problems?
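Just to be clear about what I mean by splitting: each app would get its own index, and a search would then have to target a wildcard pattern covering all of them, something like this hypothetical NEST sketch (index and field names are made up):

```csharp
// 'client' is the same ElasticClient as in the earlier sketch.
// Searching every per-application index at once via a wildcard,
// the same way a Kibana index pattern like app-logs-* would.
var result = client.Search<object>(s => s
    .Index("app-logs-*")
    .Query(q => q
        .Match(m => m
            .Field("message")   // illustrative field name
            .Query("timeout"))));
```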