Hello! I have a problem with my mapping. I receive a lot of JSON log files, and I cannot tell in advance what data type a given field will have.
So my first log might have a field content with a value like this: <<string>>
and my next log might have the same field as this: <<object>>
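To illustrate the conflict, two hypothetical log lines (the field name content comes from above; the values are made up) could look like this:

{"content": "user logged in successfully"}
{"content": {"status": 200, "detail": "user logged in"}}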
When that happens I don't want to lose the whole log document. I found ignore_malformed,
and I use Logstash to preprocess my logs.
I want the mapping behavior to stay exactly as it is now: dynamic type mapping, dynamic indexing of any new fields, and not_analyzed .raw subfields for strings. The following is my mapping:
{
  "settings": {
    "index.mapping.ignore_malformed": true
  },
  "mappings": {
    "_default_": {
      "dynamic": "true",
      "_all": {
        "omit_norms": true,
        "enabled": true
      },
      "dynamic_templates": [{
        "string_fields": {
          "mapping": {
            "index": "analyzed",
            "omit_norms": true,
            "type": "string",
            "fields": {
              "raw": {
                "index": "not_analyzed",
                "type": "string"
              }
            }
          },
          "match_mapping_type": "string",
          "match": "*"
        }
      }]
    }
  }
}
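My understanding of the string_fields template is that every new string field, for example content, should get an analyzed main field plus a not_analyzed raw subfield, roughly like this (a sketch of the mapping I expect to be generated, not actual output from my cluster):

"content": {
  "type": "string",
  "index": "analyzed",
  "omit_norms": true,
  "fields": {
    "raw": {
      "type": "string",
      "index": "not_analyzed"
    }
  }
}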
In Logstash I've configured this output:
output {
  stdout {
    codec => rubydebug { metadata => true }
  }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "%{[@metadata][es_index_full]}"
    user => "logstash"
    password => "xxxxxxxxxx"
    template => "/etc/logstash/conf.d/mapping.json"
    manage_template => true
    template_overwrite => true
  }
}
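With the rubydebug codec (metadata => true) I can see the @metadata field that selects the index. A hypothetical event, shown here as JSON with made-up values, looks roughly like this:

{
  "@timestamp": "2017-01-15T10:12:33.000Z",
  "@metadata": {
    "es_index_full": "logstash-2017.01.15"
  },
  "content": "user logged in successfully"
}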
It seems that Logstash picks up my mapping template, but it does not work: I still lose the documents where the field type does not match, and my .raw fields are missing.
What do I have to do?
Many thanks in advance,
Daniel