The doc says that to access the _source
you need to write:
ctx['_source']
More at https://www.elastic.co/guide/en/elasticsearch/painless/6.6/painless-update-by-query-context.html
Maybe that's your problem?
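For example, a minimal update by query using that syntax would look something like this (my-index and my_field here are just placeholders):

POST my-index/_update_by_query
{
  "script": {
    "source": "ctx._source['my_field'] = 'new value'"
  }
}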
Hello David,
I'm using version 6.3, and according to this doc it seems like I have the correct syntax, but I will try what you suggested and post my findings:
https://www.elastic.co/guide/en/elasticsearch/painless/6.3/painless-examples.html
Thanks
Hello David,
I just tried this:
POST filebeat-6.0.0-2018.12.18/_update_by_query
{
  "query": {
    "match_all": {}
  },
  "script": {
    "source": "ctx._source['@timestamp'] = OffsetDateTime.parse(ctx._source['@timestamp']).plusYears(1)"
  }
}
and got this error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "cannot write xcontent for unknown value of type class java.time.OffsetDateTime"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "cannot write xcontent for unknown value of type class java.time.OffsetDateTime"
  },
  "status": 400
}
I have no clue what the right syntax is here, since the docs point me in all different directions. Please let me know what I'm missing.
Thanks
Hey David,
I got it to work by using the toString() method. Apparently Elasticsearch can't serialize the OffsetDateTime object back into the document, but it can write the ISO string:
POST filebeat-6.0.0-2018.12.18/_update_by_query
{
  "query": {
    "match_all": {}
  },
  "script": {
    "source": "ctx._source['@timestamp'] = OffsetDateTime.parse(ctx._source['@timestamp']).plusYears(1).toString()"
  }
}
How do I get the data with the new timestamps to show up under _cat/indices? For example, I modified filebeat-6.0.0-2018.12.18, 2018.12.17, and 2018.11.17 and added one year to them. I assumed I would see three new indices with the new year, but I am only seeing one, 2019.12.18. Under the Kibana Discover tab the documents show up under the current month with the same document count as last year, but when I do _cat/indices the counts are not correct:
green open filebeat-6.0.0-2018.12.17 ZB2RB6VqRqavxNYkp8uZjw 5 1 82199 3323 42.7mb 21.7mb
green open filebeat-6.0.0-2019.02.19 Etq3ubN_R5KIl53Wi_FP6g 5 1 62364 0 38.7mb 19.4mb
green open filebeat-6.0.0-2018.12.18 x7476yM1Tqqd7iITM0lH9g 5 1 35396 0 18.5mb 9.2mb
green open filebeat-6.0.0-2019.02.10 L4mQJzCeRjuTLZPWB0PKdg 5 1 144127 0 77.7mb 38.8mb
green open filebeat-6.0.0-2019.02.12 B712uGiIRjmrEeNyl5F7rA 5 1 69121 0 41.4mb 20.7mb
green open filebeat-6.0.0-2019.02.11 mICGDhjFRRqCdHIqt2PrLQ 5 1 190740 0 141.7mb 70.9mb
green open filebeat-6.0.0-2018.11.16 SpqmHavsS2mMeNR0loY4Kw 5 1 756767 78627 792.3mb 400.2mb
green open filebeat-6.0.0-2018.11.15 wf8vC6nOTn-QHi-eijOr8Q 5 1 878074 0 424.3mb 210.9mb
green open filebeat-6.0.0-2019.12.18 aX1w-bAPTJ6FvI4jYhu2GQ 5 1 5013 0 3.4mb 1.7mb
green open filebeat-6.0.0-2019.03.04 uG5mp1RKQIWhK548rodM2w 5 1 63362 0 59.3mb 31.5mb
I noticed it created the 2019.12.18 index automatically, but the other two are missing.
You need to reindex them into new indices, which basically means changing the _index metadata.
So it's not an update by query that you need to run, but a reindex.
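Something like this, for instance (untested sketch; one _reindex call per source index, with the destination name adjusted accordingly):

POST _reindex
{
  "source": {
    "index": "filebeat-6.0.0-2018.12.17"
  },
  "dest": {
    "index": "filebeat-6.0.0-2019.12.17"
  }
}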
Okay, so I need to reindex into the correct index and then run update by query to change the timestamp, right?
The main goal here is that a bulk of data got inserted into Elasticsearch with the wrong timestamp, and I need to move it into another index with the correct timestamp.
I think you can do both at the same time.
You can run a script directly from the reindex API, or, if that is not flexible enough, you can define an ingest pipeline that does the date transformation like we saw before and derives the index name with https://www.elastic.co/guide/en/elasticsearch/reference/current/date-index-name-processor.html
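For example, with a script directly in the reindex call (untested sketch, reusing the script that worked for you above):

POST _reindex
{
  "source": {
    "index": "filebeat-6.0.0-2018.12.18"
  },
  "dest": {
    "index": "filebeat-6.0.0-2019.12.18"
  },
  "script": {
    "source": "ctx._source['@timestamp'] = OffsetDateTime.parse(ctx._source['@timestamp']).plusYears(1).toString()"
  }
}

Or with an ingest pipeline, where the date_index_name processor picks the destination index from the corrected timestamp (also a sketch; the pipeline name fix-timestamp is just an example, and note that in ingest scripts the document fields live on ctx directly, not ctx._source):

PUT _ingest/pipeline/fix-timestamp
{
  "processors": [
    {
      "script": {
        "source": "ctx['@timestamp'] = OffsetDateTime.parse(ctx['@timestamp']).plusYears(1).toString()"
      }
    },
    {
      "date_index_name": {
        "field": "@timestamp",
        "index_name_prefix": "filebeat-6.0.0-",
        "date_rounding": "d",
        "index_name_format": "yyyy.MM.dd"
      }
    }
  ]
}

POST _reindex
{
  "source": {
    "index": "filebeat-6.0.0-2018.12.17"
  },
  "dest": {
    "index": "placeholder",
    "pipeline": "fix-timestamp"
  }
}

The dest index in the second call is just a required placeholder; the date_index_name processor overrides it per document.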
Thanks for your help with this issue, David. Much appreciated.