Hi there,
Consider the following Logstash json filter snippet:
json {
  skip_on_invalid_json => true
  source => "message_body"
  target => "jsondoc"
}
which results in the following Logstash event:
{
  "@timestamp" => 2020-11-27T19:36:46.292Z,
  "jsondoc" => {
    "taskBegin" => true,
    "time" => "2020-09-22T02:39:33.255Z",
    "parent" => {
      "pid" => 8188,
      "taskid" => 34863,
      "name" => "PAS"
    },
    "v" => 0,
    "pid" => 23404,
    "taskid" => 157694,
    "gid" => "En5MFzJZ2JjgLl3HrIfNtA",
    "reqAccepted" => true,
    "req" => {
      "port" => 18681,
      "path" => "/v2/searchTasks/GK2kwu8gSE2fzI7dnUaICg/results?continueToken=&limit=100",
      "method" => "GET"
    },
    "level" => 30,
    "msg" => "",
    "name" => "LoadBalancer"
  }
}
As you can see, the parsed document contains values of several types: booleans, integers, and strings.
Given the highly dynamic nature of this particular field, I would like the json plugin to parse it so that all values end up as strings, including booleans and numbers. The goal is to avoid mapping exceptions when outputting to Elasticsearch. Defining an explicit mapping for these fields isn't feasible either because, again, they change depending on the input log, which can serialise all sorts of JSON blobs.
In other words: how can I ensure all values are strings when processing this field with the Logstash json plugin?
Desired example output below:
{
  "@timestamp" => 2020-11-27T19:36:46.292Z,
  "jsondoc" => {
    "taskBegin" => "true",
    "time" => "2020-09-22T02:39:33.255Z",
    "parent" => {
      "pid" => "8188",
      "taskid" => "34863",
      "name" => "PAS"
    },
    "v" => "0",
    "pid" => "23404",
    "taskid" => "157694",
    "gid" => "En5MFzJZ2JjgLl3HrIfNtA",
    "reqAccepted" => "true",
    "req" => {
      "port" => "18681",
      "path" => "/v2/searchTasks/GK2kwu8gSE2fzI7dnUaICg/results?continueToken=&limit=100",
      "method" => "GET"
    },
    "level" => "30",
    "msg" => "",
    "name" => "LoadBalancer"
  }
}
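In case it helps frame an answer: I suspect the json filter alone can't do this, and that a ruby filter would be needed to post-process the parsed field. Below is a rough, untested sketch of what I have in mind (the recursive stringify lambda is purely my own construction, not an existing plugin option), though I'd much prefer a built-in way if one exists:

ruby {
  code => '
    # Recursively walk the parsed document and convert every
    # leaf value to a string; hashes and arrays keep their
    # structure, only their contents are stringified.
    # (Note: nil values would become empty strings.)
    stringify = lambda do |value|
      case value
      when Hash
        value.each_with_object({}) { |(k, v), h| h[k] = stringify.call(v) }
      when Array
        value.map { |v| stringify.call(v) }
      else
        value.to_s
      end
    end
    doc = event.get("jsondoc")
    event.set("jsondoc", stringify.call(doc)) if doc.is_a?(Hash)
  '
}

Placed right after the json filter above, this should produce the desired output, but it runs Ruby code on every event, which is why I'm asking whether the json plugin can do this natively.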
Thanks in advance!