Hello, I am trying to parse a complex JSON document using the split plugin, but the result is not what I intended.
The JSON file looks like this:
{
  "col1" : "0C35C",
  "col2" : 0,
  "col3" : "was",
  "col4" : "AAF",
  "col5" : "20190329",
  "col6" : [
    {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"16.21.57.7","test6":4448},
    {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"161.178.2.54","test6":48}
  ]
}
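Note that on disk the file really is pretty-printed across multiple lines, exactly as shown above; it is not minified onto a single line like this:
{"col1":"0C35C","col2":0,"col3":"was","col4":"AAF","col5":"20190329","col6":[{"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"16.21.57.7","test6":4448},{"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"161.178.2.54","test6":48}]}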
My Logstash config looks like this:
input {
  file {
    path => "/tmp/test.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
filter {
  json {
    source => "message"
  }
  split {
    field => "col6"
  }
}
output {
  stdout {}
}
And the result is:
{
"test1" => "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
"@version" => "1",
"host" => "kafka.novalocal",
"test3" => "/favicon.ico",
"@timestamp" => 2019-04-15T09:50:24.289Z,
"test2" => "/test/test.img",
"test5" => "161.178.2.54",
"test4" => 1553827094.070707,
"tags" => [
[0] "_split_type_failure"
],
"test6" => 48,
"path" => "/tmp/test.json"
}
{
"test1" => "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
"@version" => "1",
"host" => "kafka.novalocal",
"test3" => "/favicon.ico",
"@timestamp" => 2019-04-15T09:50:24.287Z,
"test2" => "/test/test.img",
"test5" => "16.21.57.7",
"test4" => 1553827094.070707,
"tags" => [
[0] "_split_type_failure"
],
"test6" => 4448,
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.273Z,
"message" => "{",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.282Z,
"message" => " \"col5\" : \"20190329\",",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.276Z,
"message" => " \"col1\" : \"0C35C\",",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.291Z,
"message" => " ]",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.281Z,
"message" => " \"col4\" : \"AAF\",",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.293Z,
"message" => "}",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.279Z,
"message" => " \"col3\" : \"was\",",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.283Z,
"message" => " \"col6\" : [",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
{
"@timestamp" => 2019-04-15T09:50:24.278Z,
"message" => " \"col2\" : 0,",
"@version" => "1",
"tags" => [
[0] "_jsonparsefailure",
[1] "_split_type_failure"
],
"host" => "kafka.novalocal",
"path" => "/tmp/test.json"
}
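What I actually want is one event per element of col6, with the top-level fields (col1 to col5) kept on each event. Roughly, I expect the first event to look something like this (ignoring @timestamp, host, path and the other fields Logstash adds), with a second event for the other element of col6:
{
  "col1" : "0C35C",
  "col2" : 0,
  "col3" : "was",
  "col4" : "AAF",
  "col5" : "20190329",
  "col6" : {
    "test1" : "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
    "test2" : "/test/test.img",
    "test3" : "/favicon.ico",
    "test4" : 1553827094.070707,
    "test5" : "16.21.57.7",
    "test6" : 4448
  }
}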
It looks as if each line of the file was treated as a separate event instead of the whole document being parsed as one JSON object. How can I solve this problem?