Logstash

Hi,

I am using Logstash to migrate data from ES 5.6 to ES 7.9. I am aware of the changes between those versions. I have the following mapping for a document type in 5.6:

    "transaction_information": {
                    "dynamic": "false",
                    "date_detection": false,
                    "properties": {
                        ...
                        "something": {
                            "type": "nested",
                            "properties": {
                                "key": {
                                    "type": "keyword"
                                },
                                "value": {
                                    "type": "keyword"
                                }
                            }
                        },
                        "ts_insert": {
                            "type": "date",
                            "format": "date_time"
                        }
                        ...
                    }
                }
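
For reference, a rough sketch of how that mapping goes onto the 7.9 side (the index name here is just an example and the other fields are omitted; mapping types are gone in 7.x, so the transaction_information level is dropped and the properties sit directly under mappings):

    PUT reindex-calculator-2020-12
    {
        "mappings": {
            "dynamic": "false",
            "date_detection": false,
            "properties": {
                "something": {
                    "type": "nested",
                    "properties": {
                        "key": { "type": "keyword" },
                        "value": { "type": "keyword" }
                    }
                },
                "ts_insert": {
                    "type": "date",
                    "format": "date_time"
                }
            }
        }
    }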

I copied that mapping to a new ES 7.9 index (roughly as sketched above) and ran Logstash with the following config:

    input {
        elasticsearch {
            hosts => "localhost:9201"
            index => "old-index"
            query => '{ "query": { "bool": { "must": { "term": { "_type": "transaction_information" } } } } }'
            size => 1000
            scroll => "1m"
            codec => 'json'
            docinfo => true
        }
    }

    output {
        elasticsearch {
            ilm_enabled => false
            hosts => "localhost:9200"
            index => "reindex-%{[@metadata][_index]}"
            document_type => "%{[@metadata][_type]}"
            document_id => "%{[@metadata][_id]}"
        }
    }

The documents I am moving often include multiple fields with different date formats, e.g.:

    {
        "value": "2020-12-01T14:19:55.833Z",
        "key": "RequestTime"
    },
    {
        "value": "2020-12-01",
        "key": "endDate"
    },

I have successfully indexed such documents manually into Elasticsearch 7.9, but obviously I want this automated.
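
By "manually" I mean something along these lines (just a sketch with an example index name and document ID, assuming the key/value pairs above sit under the nested something field from the mapping):

    PUT reindex-calculator-2020-12/_doc/1
    {
        "something": [
            { "value": "2020-12-01T14:19:55.833Z", "key": "RequestTime" },
            { "value": "2020-12-01", "key": "endDate" }
        ]
    }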

For some reason, Logstash fails to do this and tries to treat string values in my documents as dates.

I am getting the following error from Logstash:

    [2021-03-31T13:46:50,549][WARN ][logstash.outputs.elasticsearch][main][4499cf5fbc14f7516629c384c78b1bce9e827f2bcf2163a07746aa719c0e3a43] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"AXazXUVd3sQvsc-sUI7R", :_index=>"reindex", :routing=>nil, :_type=>"transaction_information"}, #<LogStash::Event:0x6de56487>], :response=>{"index"=>{"_index"=>"reindex-calculator-2020-12", "_type"=>"transaction_information", "_id"=>"AXazXUVd3sQvsc-sUI7R", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [transactionParams.value] cannot be changed from type [text] to [date]"}}}}

I have no idea what to do now. I'll be grateful for any input.

Hi,

The transactionParams.value field has to be a date field instead of a text field.

Use the Elasticsearch update mapping API to make the change.
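
Something roughly like this (a sketch only; the field path is taken from the error message, and I'm assuming transactionParams is an object or nested field):

    PUT reindex-calculator-2020-12/_mapping
    {
        "properties": {
            "transactionParams": {
                "properties": {
                    "value": {
                        "type": "date"
                    }
                }
            }
        }
    }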

Cad.

Thanks for replying.

Obviously it cannot be a date, as the field is called value and can hold anything.
