JSON parsing problem

Hi there!

My goal is very simple (or so it seems): I want Logstash to receive JSON over HTTP, process that JSON with a script, and then pass it along as JSON.

I started with the json codec, but I couldn't find any way to properly access the root of the event object, so I could not get the whole JSON content. Then I saw the topic How to read JSON input sent to Http input plugin in filter section and tried to build a config like the one in the answer.

Let's say I have this config now:

input {
  http {
    id => "my_plugin_id"
    port => 12345
    additional_codecs => { }
  }
}

filter {
  json {
    source => "message"
    target => "json"
  }
  # one day there will be a ruby script here
}

and this json:
{
  "some": {
    "some2": "SOME-AUTOGENERATED-ID1",
    "some3": "stat"
  },
  "user_name": "some4",
  "machine_name": "SOME",
  "install_ver": "SOME",
  "type": "SOME",
  "message_desc": "test",
  "time": "2002-02-10T16:58:48.000+0200",
  "win_ver": "Some version",
  "proc_ver": "version"
}

And what I'm trying to do is to send my JSON to that port using curl:
curl -H "Content-Type: application/json" -XPOST "localhost:12345" --data-binary @stat2.json

Instead of the real JSON in the json field of the output, this is what I actually get from Logstash:
"json" => {
"install_ver" => "SOME",
"user_name" => "some4",
"machine_name" => "SOME",
"time" => "2002-02-10T16:58:48.000+0200",
"win_ver" => "Windows 10 1703",
"type" => "SOME",
"some" => {
"some2" => "SOME-AUTOGENERATED-ID1",
"some3" => "stat"
},
With a => separator between key and value instead of a colon. That's definitely not JSON.

I've already tried
mutate { gsub => ["json", "=>", ":"] }
but it does nothing to the json field; gsub only works on text fields or arrays of text.
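To see why gsub can't help here, here is a minimal plain-Ruby sketch (outside Logstash, with an invented hash standing in for the event field): after the json filter runs, the json field holds a parsed Hash, not a String, so there is no text for gsub to substitute in.

```ruby
# Hypothetical stand-in for what event.get("json") returns after the json filter:
parsed = { "user_name" => "some4", "some" => { "some3" => "stat" } }

# mutate's gsub needs string (or array-of-string) fields; a Hash has no gsub method.
puts parsed.respond_to?(:gsub)   # false

# The "=>" only appears when the Hash is *rendered* with Ruby's inspect notation:
puts parsed.inspect
```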

I just want to have real JSON in the "json" field, with colon separators. Of course I could do the replacement in my script, but this seems like a very common need, so I'm looking for a native solution.

Could someone here help me, please?

What output are you using? Have you specified a codec on it?

I am using such output:

output {
  stdout { 
        codec => rubydebug 
    } 
}

I've also tried to use just stdout, without a codec.

But I believe it's not an output setting issue, because if I try something like this in the filter:
ruby { code => 'open("test.json", "w") { |file| file.write(event.get("json")) }' }
I get the same => stuff in the test.json file.

And if my information is correct, the pipeline is input -> filter -> output.

The format with "fieldname" => "fieldvalue" is what rubydebug does. It is working as expected. If you want json then just tell it that.

output { stdout { codec => json_lines } }
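To illustrate the difference with a plain-Ruby sketch (not Logstash itself, and with invented field values): the => style is just how Ruby renders a Hash, while JSON serialization of the same Hash uses colons.

```ruby
require 'json'

# Invented example data shaped like the event's "json" field:
event_field = {
  "install_ver" => "SOME",
  "some"        => { "some2" => "SOME-AUTOGENERATED-ID1", "some3" => "stat" }
}

# Roughly what a debug-style codec shows: Ruby Hash notation with =>
puts event_field.inspect

# What a JSON codec emits: real JSON with colons
puts event_field.to_json
```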

That helps, but only for stdout. Now I get correct (if unformatted) JSON in the console output, but still the => stuff in the file I create through ruby at the filter stage.

Is there a way to fix the separator at the filter stage?

You are using a ruby filter to write to a file? Can you show us what that filter looks like? It will probably end up being the same problem in another guise.

Yes.
The filter code looks like this:
filter {
  json {
    source => "message"
    target => "json"
  }
  ruby { code => 'open("test.json", "w") { |file| file.write(event.get("json")) }' }
}

ruby {code => 'open("/tmp/test.json", "w") { |file| file.write(event.get("json").to_json) }' }

Note the .to_json at the end.
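The same fix, sketched in plain Ruby (the file and field contents are invented for the example): serializing the Hash with to_json before writing produces a file that parses back to the original structure.

```ruby
require 'json'
require 'tempfile'

# Invented stand-in for event.get("json"):
event_field = { "user_name" => "some4", "message_desc" => "test" }

file = Tempfile.new(["test", ".json"])
file.write(event_field.to_json)   # write real JSON, not Hash#inspect output
file.flush

# Round-trip check: the file contents parse back to the original Hash
restored = JSON.parse(File.read(file.path))
puts restored == event_field      # true
```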

.to_json works perfectly, thank you Badger!
