Reading and using Elasticsearch metadata with a Logstash configuration

I am having a problem reading a dump file taken from Elasticsearch on another system and pushing it into my own Elasticsearch with Logstash via the file input plugin. My dump file looks like this:

{"_index":"logstash-2018.06.14","_type":"doc","_id":"9Q-9AGQBaaf188t_6DmH","_score":1,"_source":{"offset":124076,"tags":["filebeat_json","beats_input_raw_event","_jsonparsefailure"],"type":"log","fields":{"type":"filebeat_docker"},"@version":"1","input_type":"log","stream":"stdout","pipeline":"filter-java","source":"/var/lib/docker/containers/781d6b218cbedc67ffaac68b95f71603cc239a177c47c71303f9d2e6d59d9825/781d6b218cbedc67ffaac68b95f71603cc239a177c47c71303f9d2e6d59d9825-json.log","beat":{"name":"os-be-master1.local","version":"5.6.5","hostname":"os-be-master1.local"},"host":"os-be-master1.local","@timestamp":"2018-06-14T23:59:58.129Z"}}
{"_index":"logstash-2018.06.14","_type":"doc","_id":"DQ-9AGQBaaf188t_6DqH","_score":1,"_source":{"offset":145573,"tags":["filebeat_json","beats_input_raw_event","_jsonparsefailure"],"type":"log","fields":{"type":"filebeat_docker"},"@version":"1","input_type":"log","stream":"stdout","pipeline":"filter-java","source":"/var/lib/docker/containers/781d6b218cbedc67ffaac68b95f71603cc239a177c47c71303f9d2e6d59d9825/781d6b218cbedc67ffaac68b95f71603cc239a177c47c71303f9d2e6d59d9825-json.log","beat":{"version":"5.6.5","name":"os-be-master1.local","hostname":"os-be-master1.local"},"host":"os-be-master1.local","@timestamp":"2018-06-14T23:59:59.131Z"}}

My configuration file is as follows:

input {
        file {
                path => "/home/vm01/Documents/log/output.json"
                type => "log"
                start_position => "beginning"
                sincedb_path => "/home/vm01/Documents/sincedb_redefined"
                codec => multiline {
                        pattern => '^\{'
                        negate => true
                        what => "previous"
                }
        }
}

filter {
        if [type] == "log" {
                json {
                        source => "message"
                }
        }
}

output {
        if [type] == "log" {
                elasticsearch {
                        hosts => "localhost:9200"
                        index => "log-%{+YYYY.MM.dd}"
                }
        }
}

But it gave me an error like this:

[WARN ] 2018-07-10 13:13:53.685 [Ruby-0-Thread-18@[main]>worker7: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:385] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2018.07.10", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x17052ccb>], :response=>{"index"=>{"_index"=>"logstash-2018.07.10", "_type"=>"doc", "_id"=>"gvflg2QB1n75DXFZzVPL", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Field [_type] is a metadata field and cannot be added inside a document. Use the index API request parameters."}}}}

I suspect this is because the dump file already contains the Elasticsearch metadata from the previous VM, and those fields cannot be inserted as part of the newly pushed documents. Is there a way for me to use the metadata inside the file rather than the metadata that is newly created?
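For reference, the parsed events can be inspected with a temporary stdout output to see exactly which of the dump's metadata fields (_index, _type, _id, _score, _source) end up on each event. This is only a debugging sketch, not part of my real pipeline:

output {
        # Temporary debug output: prints every event produced by the json filter,
        # including the _index, _type and _id fields coming from the dump file.
        stdout {
                codec => rubydebug
        }
}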

I suggest renaming the field in the filter section:

filter {
        mutate {
                rename => { "_type" => "doc_type" }
        }
}

and then referencing it in the output section:

output {
        elasticsearch {
                document_type => "%{doc_type}"
        }
}
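Building on that suggestion, here is a minimal sketch of a fuller filter/output pair that moves all of the reserved metadata fields (_index, _type, _id) out of the document body and reuses them when indexing. The [@metadata][es_*] field names are just placeholders I chose, not anything required by Logstash:

filter {
        if [type] == "log" {
                json {
                        source => "message"
                }
                # Move the reserved Elasticsearch metadata fields into @metadata
                # so they stay available to the output but are not indexed as
                # part of the document body.
                mutate {
                        rename => {
                                "_index" => "[@metadata][es_index]"
                                "_type"  => "[@metadata][es_type]"
                                "_id"    => "[@metadata][es_id]"
                        }
                        remove_field => [ "_score" ]
                }
        }
}

output {
        if [type] == "log" {
                elasticsearch {
                        hosts => "localhost:9200"
                        # Reuse the metadata carried in the dump instead of the
                        # values Logstash would otherwise generate.
                        index => "%{[@metadata][es_index]}"
                        document_id => "%{[@metadata][es_id]}"
                        document_type => "%{[@metadata][es_type]}"
                }
        }
}

Note that the actual log data will still be nested under the _source field after the json filter runs; if it should sit at the top level of the new documents, it would need to be moved out of _source as well.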
