How can I parse a complex JSON file?

Hello, I am trying to parse a complex JSON file using the split plugin, but the result is not what I intended.

The JSON file looks like this:

{
   "col1" : "0C35C",
   "col2" : 0,
   "col3" : "was",
   "col4" : "AAF",
   "col5" : "20190329",
   "col6" : [
      {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"16.21.57.7","test6":4448},
	  {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"161.178.2.54","test6":48}      
   ]
}

My Logstash config looks like this:

input {
	file {
		path => "/tmp/test.json"	
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => "json"
	}
}

filter {
	json {
		source => "message"
	}
	split {
		field => "col6"
	}
}

output {
	stdout {}
}

And the result is:

{
         "test1" => "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
      "@version" => "1",
          "host" => "kafka.novalocal",
         "test3" => "/favicon.ico",
    "@timestamp" => 2019-04-15T09:50:24.289Z,
         "test2" => "/test/test.img",
         "test5" => "161.178.2.54",
         "test4" => 1553827094.070707,
          "tags" => [
        [0] "_split_type_failure"
    ],
         "test6" => 48,
          "path" => "/tmp/test.json"
}
{
         "test1" => "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
      "@version" => "1",
          "host" => "kafka.novalocal",
         "test3" => "/favicon.ico",
    "@timestamp" => 2019-04-15T09:50:24.287Z,
         "test2" => "/test/test.img",
         "test5" => "16.21.57.7",
         "test4" => 1553827094.070707,
          "tags" => [
        [0] "_split_type_failure"
    ],
         "test6" => 4448,
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.273Z,
       "message" => "{",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.282Z,
       "message" => "   \"col5\" : \"20190329\",",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.276Z,
       "message" => "   \"col1\" : \"0C35C\",",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.291Z,
       "message" => "   ]",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.281Z,
       "message" => "   \"col4\" : \"AAF\",",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.293Z,
       "message" => "}",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.279Z,
       "message" => "   \"col3\" : \"was\",",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.283Z,
       "message" => "   \"col6\" : [",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}
{
    "@timestamp" => 2019-04-15T09:50:24.278Z,
       "message" => "   \"col2\" : 0,",
      "@version" => "1",
          "tags" => [
        [0] "_jsonparsefailure",
        [1] "_split_type_failure"
    ],
          "host" => "kafka.novalocal",
          "path" => "/tmp/test.json"
}

How can I solve this problem?

You are already decoding the message using the codec, so why have a separate JSON filter?

Thanks for your help.
However, even if I remove the JSON filter, the problem is not solved.
In my opinion, the JSON filter is unnecessary, but it is not the cause of the problem.

Your data does not seem to have any col6 field. What does the input look like?

You may want to add a conditional around the split filter so it only runs when the expected field is present.
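
A minimal sketch of such a conditional, assuming the array field is named col6 as in the sample data:

filter {
	json {
		source => "message"
	}
	# Only split when the parsed event actually contains the array field.
	if [col6] {
		split {
			field => "col6"
		}
	}
}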

'col6' does exist.

The JSON file looks like this:

{
   "col1" : "0C35C",
   "col2" : 0,
   "col3" : "was",
   "col4" : "AAF",
   "col5" : "20190329",
   "**col6**" : [
      {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"16.21.57.7","test6":4448},
	  {"test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0","test2":"/test/test.img","test3":"/favicon.ico","test4":1553827094.070707,"test5":"161.178.2.54","test6":48}      
   ]
}

And I want results like the following.
Is it possible to parse it this way?

{
   "col1" : "0C35C",
   "col2" : 0,
   "col3" : "was",
   "col4" : "AAF",
   "col5" : "20190329",
   "test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
   "test2":"/test/test.img",
   "test3":"/favicon.ico",
   "test4":1553827094.070707,
   "test5":"16.21.57.7",
   "test6":4448
}
{
   "col1" : "0C35C",
   "col2" : 0,
   "col3" : "was",
   "col4" : "AAF",
   "col5" : "20190329",
   "test1":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
   "test2":"/test/test.img",
   "test3":"/favicon.ico",
   "test4":1553827094.070707,
   "test5":"161.178.2.54",
   "test6":48
}

It did not seem to exist in the results you posted above.

You need to use a multiline codec to combine all the lines from a single JSON object into one event. Is there more than one object in a file?
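
For example, a minimal sketch of such an input, assuming the file contains exactly one JSON object (the pattern is deliberately chosen so it never matches, which joins every line onto the previous one; auto_flush_interval flushes the buffered event since no further line will arrive):

input {
	file {
		path => "/tmp/test.json"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => multiline {
			# A pattern that never matches a line, so every line is
			# appended to the previous one and the whole file becomes one event.
			pattern => "^__NEVER_MATCH__"
			negate => true
			what => "previous"
			# Flush the pending event after 1 second of inactivity so the
			# last (only) object is emitted even though no new line follows.
			auto_flush_interval => 1
		}
	}
}

With this codec the whole object ends up in the message field, so the json codec is dropped here and the json filter does the parsing instead.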


Thanks for your comment.

The JSON file has only one object, and I solved the problem the way you suggested.
(I combined all the lines into a single event with a multiline codec and then parsed it.)
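
Roughly, the filter side is json followed by split; a simplified sketch (the ruby step is just one possible way to flatten the col6 sub-object into the flat output shown above):

filter {
	# The multiline codec leaves the whole object in the message field.
	json {
		source => "message"
	}
	# Emits one event per element of col6; col1..col5 are copied into each event.
	split {
		field => "col6"
	}
	# One possible way to lift the col6 sub-fields (test1..test6) to the top level.
	ruby {
		code => '
			event.get("col6").each { |k, v| event.set(k, v) }
			event.remove("col6")
		'
	}
}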

Have a good day~

The problem is solved.
Thank you for looking into my problem with me.
