I'd probably use an ingest pipeline and reindex all the data with the reindex API. That way you end up with a correct mapping instead of keeping old, unneeded fields.
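As a sketch, you can point the reindex at the pipeline directly (index names and the pipeline name here are placeholders, adjust to your setup):

```
POST _reindex
{
  "source": { "index": "old-index" },
  "dest": {
    "index": "new-index",
    "pipeline": "mypipeline"
  }
}
```

Every document copied from `old-index` then runs through `mypipeline` before being written to `new-index`, so the new index never sees the old fields.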
I created a pipeline and verified it with the _ingest/pipeline/mypipeline/_simulate API; it works fine there.
However, in the actual setup the document is rejected and nothing gets indexed.
If it isn't indexing, where can I see the rejected/failed documents?
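One common way to capture failures is to add an `on_failure` block to the pipeline, which redirects failing documents to a separate index together with the error message instead of dropping them. A minimal sketch (the `failed-docs` index name is a placeholder; replace the `...` with your actual processors):

```
PUT _ingest/pipeline/mypipeline
{
  "processors": [ ... ],
  "on_failure": [
    {
      "set": {
        "field": "_index",
        "value": "failed-docs"
      }
    },
    {
      "set": {
        "field": "error_message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}
```

You can then search `failed-docs` to see which documents failed and why. If you are indexing via the bulk API or the reindex API, the response also contains a per-document error section that is worth inspecting.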
Besides this, I need to create some dynamic fields (from a known set). Is there a way to convert these static attributes,
i.e. "name": "parameter1"
"value": "value1"
to "parameter1": "value1"?
I need this transformation so that I can use the Kibana dashboard. I appreciate your support.
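For this name/value flattening, a script processor in an ingest pipeline should do it. A minimal sketch, assuming the documents have exactly the `name` and `value` fields shown above (the pipeline name is a placeholder):

```
PUT _ingest/pipeline/flatten-params
{
  "processors": [
    {
      "script": {
        "source": "ctx[ctx.name] = ctx.value; ctx.remove('name'); ctx.remove('value');"
      }
    }
  ]
}
```

This turns `{"name": "parameter1", "value": "value1"}` into `{"parameter1": "value1"}`. Since the set of parameter names is known, you can also declare each of them explicitly in the mapping so Kibana sees the correct types.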
I tried to put all the parameters into a mapping file, but it doesn't seem to work. Is it because every message has only one parameter (i.e. domain attribute), e.g. gear_pos, location? Any suggestion on the mapping to create for visualisation in Kibana?
When I try to create it, Kibana says that the index pattern doesn't have any compatible geo_point field.
I created the Elasticsearch index with the mappings below via Postman.
After issuing this command, I send the data, but the data isn't getting indexed in Elasticsearch.
I meant that the data source is started, so Elasticsearch starts receiving data. However, it isn't getting indexed.
However, if I don't create the index manually (as described above) before sending the data, an index is created automatically by Elasticsearch and the location field is mapped as "text" instead of geo_point.
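One way to avoid having to pre-create the index by hand is an index template: any index whose name matches the pattern then gets the geo_point mapping automatically at creation time. A sketch using the composable template API (available from Elasticsearch 7.8; the template name and index pattern are placeholders):

```
PUT _index_template/my-geo-template
{
  "index_patterns": ["mydata-*"],
  "template": {
    "mappings": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
```

With this in place, even an auto-created `mydata-*` index maps `location` as geo_point, and Kibana's index pattern will find a compatible field. Note that geo_point values must arrive in a format Elasticsearch accepts (e.g. an object with `lat`/`lon`, or a "lat,lon" string), otherwise indexing will still fail.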