Prevent nested JSON from appearing in Elasticsearch fields

Hello, I'm using the http filter to make an API call to a REST API server. The data returned is JSON; it looks like this example:

{
    "a": {
        "b": {
            "c": {},
            "d": {},
            "e": {}
        }
    }
}
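For context, the http filter call is along these lines; the URL is a placeholder, and this sketch assumes the API returns application/json so the filter parses the response body into the target field:

filter {
  http {
    # hypothetical endpoint, replace with the real API URL
    url => "http://example.com/api/resource"
    verb => "GET"
    # the parsed JSON response is stored under [a]
    target_body => "[a]"
  }
}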

I want to prevent Logstash from sending c, d, and e to Elasticsearch as fields, so the fields in Elasticsearch will be a and a.b, with the a.b field containing the value of c, d, and e. Any suggestion on how to do this?

So you want to change [a][b] from a hash into a string formed by concatenating the keys from that hash?

The keys are automatically mapped to fields in Elasticsearch by dynamic mapping. What I want is for the dynamic mapping to stop at field [a][b] and leave the value of [a][b] as c, d, e, so it doesn't map [a][b][c], [a][b][d], and [a][b][e] to fields. Can I do this at the Logstash level, or should I edit the Elasticsearch template?

Can you show the data structure you want as JSON?

I want the data in c, d, and e not to be indexed as fields by Elasticsearch, so in Kibana the only field is [a][b], not [a][b][c], [a][b][d], or [a][b][e].

It is really unclear what you want. Do you want [a][b][c] to be present on the event but not indexed, or do you want to change the value of [a][b]?
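If it's the former, that is an Elasticsearch mapping concern rather than a Logstash one: disabling the [a][b] object in the index template keeps its contents in _source but creates no fields for them. A sketch of the relevant mappings fragment, to be merged into whatever index template you already use (the surrounding template body is omitted):

"mappings": {
  "properties": {
    "a": {
      "properties": {
        "b": {
          "type": "object",
          "enabled": false
        }
      }
    }
  }
}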

I do not understand what you mean by that.

I found the solution to what I want, although I ran into another problem. Basically, I want c, d, and e not to be indexed as field columns, so I use the json_encode filter on field [a][b] with target field [a][x], and then remove [a][b].
My json_encode:

json_encode {
    source => "[a][b]"
    target => "[a][x]"
    # drop the original hash once it has been serialized into [a][x]
    remove_field => ["[a][b]"]
}
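Note that json_encode is not bundled with Logstash by default; if it is missing, it can be installed with bin/logstash-plugin install logstash-filter-json_encode.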

Next, I use gsub to clean up and remove the backslashes added by json_encode, but it's not working.
My gsub filter is:

mutate {
    gsub => ["[a][x]", "[\\]", ""]
}

Edit: never mind, it's working. It displays correctly in Kibana.
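(Worth noting: the backslashes are often just JSON escaping in how the event is displayed. A stored value of {"c":{},"d":{},"e":{}} is shown as "{\"c\":{},\"d\":{},\"e\":{}}" in rubydebug or JSON output, so the string itself may contain no backslashes at all and the gsub may be a no-op. That would also explain why it displays correctly in Kibana.)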
