Move a newly added add_field out of a nested field?

Hi, I can access the nested field with [field][sub_field] and add it as a new field. However, this new field is still nested, which is no use to me. How can I move it out of the nested structure, please?

Below is an example. I used add_field to create purchase.source, but it is still nested within the purchase field.

Please don't post screenshots. Use copy/paste.

It's hard to understand what you want to accomplish. If you don't want the field to be nested and you create it with add_field why not just change the field name to not be nested? Or do you want the field name to literally be purchase.source? That won't work since field names with dots aren't allowed.
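
For example, something like this creates a top-level field directly instead of a nested one (the field name "purchase_source" and the value "somevalue" here are just placeholders):

mutate {
    add_field => { "purchase_source" => "somevalue" }
}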

Hi, thanks for the prompt reply, and sorry for the screenshot. It still doesn't work, as shown below.

"@timestamp" => "2016-08-29T17:32:23.321Z",
"path" => "/Users/yangyan/Desktop/log_file/test.log",
"host" => "yangyan-osx",
"type" => "json",
"timestamp" => "Jan 1 00:01:18",
"logsource" => "eqx-astockweb1",
"program" => "adobestock",
"pid" => "59741",
"app_id" => "as",
"geid" => "61dc639d38fb5dc98e6d65a5c5903d1e",
"etid" => "purchase",
"ev" => 1,
"date" => "2015-12-31 23:01:18",
"ip" => "10.1.8.37",
"asui" => "84e48f408d265e2f8ef36f26f75bdbcb",
"session_id" => "2193804e13f5a640afb1a1b80613281f",
"member_id" => -1,
"is_buyer" => false,
"url" => "https://stock.adobe.com/Callback/JEM/Provisioning",
"locale" => "en_US",
"purchase" => {
"source" => [
[0] "jem",
[1] "pSource"
],
"type_id" => 3,
"sao" => "323C16AD55DFE19B0A744C37",
"order_number" => nil,
"content_id" => nil,
"delegate_guid" => nil,
"sku" => "65260923"

And here is how I do the add_field within the filter plugin:
mutate {
    add_field => {
        "[purchase][source]" => "pSource"
    }
}

Okay, so this is what your event currently looks like and what configuration you have. What's the expected result, then?

Hi, my config is below:
input {
    file {
        path => "/Users/yangyan/Desktop/log_file/test.log"
        start_position => "beginning"
        type => "json"
        codec => json
    }
}

filter {
    grok {
        match => { "message" => "%{SYSLOGBASE} %{GREEDYDATA:message}" }
        overwrite => [ "message" ]
    }

    json {
        source => "message"
    }

    mutate {
        add_field => { "[purchase][source]" => "pSource" }
    }

    prune {
        whitelist_names => [ "timestamp", "app_id", "date", "member_id", "locale", "pSource" ]
    }
}

output {
    elasticsearch { }
    stdout { codec => rubydebug }
}

The original log file looks like this:
Jan 1 00:01:18 eqx-astockweb1 adobestock[59741]: {"app_id":"as","geid":"61dc639d38fb5dc98e6d65a5c5903d1e","etid":"purchase","ev":1,"date":"2015-12-31 23:01:18","mt":1451602878.69,"ip":"10.1.8.37","asui":"84e48f408d265e2f8ef36f26f75bdbcb","session_id":"2193804e13f5a640afb1a1b80613281f","member_id":-1,"is_buyer":false,"url":"https://stock.adobe.com/Callback/JEM/Provisioning","locale":"en_US","purchase":{"source":"jem","type_id":3,"sao":"323C16AD55DFE19B0A744C37","order_number":null,"content_id":null,"delegate_guid":null,"sku":"65260923"}}

The goal is to parse out only the fields in the prune whitelist above, and if possible convert date to a Date type. (Currently everything defaults to String, since the values are parsed out of JSON.)

Please kindly let me know!

Use the mutate filter's rename option to move [purchase][source] somewhere else. Then use the prune filter to delete all fields except the ones you specifically list.
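
Untested, but roughly along these lines, assuming you want the value in a top-level field called "pSource" so it matches the prune whitelist you already have:

mutate {
    rename => { "[purchase][source]" => "pSource" }
}

prune {
    whitelist_names => [ "timestamp", "app_id", "date", "member_id", "locale", "pSource" ]
}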

Thanks a lot, I got it!!! I wouldn't have noticed that rename would achieve this!

Is there a way I can convert the date field, which is parsed as a string out of the JSON, into a Date type, other than using complex Ruby?

Look at the date filter.
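
Untested, but for a date string like "2015-12-31 23:01:18" something along these lines should work (by default the parsed result goes into @timestamp; the target option below is only needed if you want the result written back to the date field instead):

date {
    match => [ "date", "yyyy-MM-dd HH:mm:ss" ]
    target => "date"
}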

Cool, I got it!
Lastly, is there any way I can see my fields' data types in Logstash, Kibana, or ES?
Right now I can see in Kibana that the date field is still a String, while @timestamp does turn out to be a Date type... should I create a new index in Kibana, maybe?
Thanks!

Your date field might be a string because the first document with a date field didn't contain a string that ES could parse as a date. It therefore became a string instead, and that's not going to change without reindexing.
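
You can check what mapping a field actually got with the mapping API, for example (the index pattern here assumes the default logstash-* daily indexes; adjust as needed):

curl -XGET 'localhost:9200/logstash-*/_mapping?pretty'

Look for the "date" property in the output and see whether its "type" came out as "string" or "date".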

But yes, it would be a good idea to stick to the defaults without trying to use date instead of @timestamp etc. Once you understand better how things work you can deviate.

Thanks! Does this mean I can't easily convert a String type to a Date type without reindexing, if I need to keep the default @timestamp as well?

An index's field mappings can't be changed without reindexing.
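
If you do want the field mapped as a date, the usual approach is to create a new index (or an index template for future indexes) with an explicit mapping and copy the data over. A rough sketch, assuming an ES version that has the _reindex API; the index names ("oldindex", "newindex") and the type name ("logs") are placeholders:

curl -XPUT 'localhost:9200/newindex' -d '{
  "mappings": {
    "logs": {
      "properties": {
        "date": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" }
      }
    }
  }
}'

curl -XPOST 'localhost:9200/_reindex' -d '{
  "source": { "index": "oldindex" },
  "dest": { "index": "newindex" }
}'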