Removed fields are still shown in the index mapping

Hi,

I faced an error saying "could not index to elastic limit of 9k exceeded". My incoming JSON has about 9k fields, so I planned to drop the unwanted fields using remove_field in Logstash. I applied a mutate filter to remove 7k of them, but it has not fixed my problem: even though I remove them, I can still see them in index/_mapping, and I still get the error "could not index to elastic limit of 9k exceeded". I don't want to write code to remove the fields from the JSON itself, which is time consuming; I want to handle this only in Logstash.

Input JSON: 9k fields
Logstash config as below:

filter {
  mutate {
    remove_field => [
      "BandingStats",
      "Bisle",
      "Base",
      "RaidHistory",
      "RaidInfo",
      "RaidInfoExt",
      "RaidMap"
    ]
  }
}
Result of index/_search:
"BandingStats" : [
{
"file_path" : "/opt/workdir/5fd436346378f7bda/PacketData/BandingStats.bin",
"option" : "BandingStats",
"decode_result" : "Pass",
"decode_duration" : 0.003,
"file_name" : "BandingStats.bin",
"collect_duration" : 0.0,
"index" : null,
"collect_result" : "Pass"
}
],
"RaidHistory" : [
{
"file_path" : "/opt/workdir/5fd547458b57578f7bda/PacketData/RaidHistory.bin",
"option" : "RaidHistorys",
"decode_result" : "Pass",
"decode_duration" : 0.002,
"file_name" : "RaidHistory.bin",
"collect_duration" : 0.0,
"index" : null,
"collect_result" : "Pass"
}
],

Result of elastic/_mapping:
"BandingStates" : {
"properties" : {
"collect_duration" : {
"type" : "keyword"
},
"collect_result" : {
"type" : "keyword"
},
"decode_duration" : {
"type" : "keyword"
},
"decode_result" : {
"type" : "keyword"
},
"file_name" : {
"type" : "keyword"
},
"file_path" : {
"type" : "keyword"
},
"index" : {
"type" : "keyword"
},
"option" : {
"type" : "keyword"
}
}
},
"RaidHistory" : {
"properties" : {
"collect_duration" : {
"type" : "keyword"
},
"collect_result" : {
"type" : "keyword"
},
"decode_duration" : {
"type" : "keyword"
},
"decode_result" : {
"type" : "keyword"
},
"file_name" : {
"type" : "keyword"
},
"file_path" : {
"type" : "keyword"
},
"index" : {
"type" : "keyword"
},
"option" : {
"type" : "keyword"
}
}
}

In the search results I can still see the removed fields, although not with their exact original structure; they appear as a kind of metadata structure that points to the .bin file.

Please help me get rid of this error. I tried the prune filter as well, and it behaves the same way.
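For reference, this is roughly the prune variant I tried, keeping only the wanted fields via whitelist_names instead of removing them one by one (the field names below are just placeholders, not my real field list):

filter {
  prune {
    # keep only the whitelisted top-level fields; everything else is dropped from the event
    whitelist_names => [ "^WantedFieldA$", "^WantedFieldB$" ]
  }
}

Even with this, the mapping still shows the old fields.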

My requirement is simple: I want to ingest only 2k of the 9k fields, and my mapping should also show only those 2k fields.

Note: we can clearly see that remove_field is working partially, but not completely. The _search result and the _mapping result are not the same: the _mapping still looks like the original source file. After the remove_field filter nothing should come through for these fields, yet this kind of metadata pointing to the .bin file data still appears.

Thanks,
Raghu
