I use Filebeat to collect logs and push them to Kafka, then use Logstash to parse every log record and push the result to another Kafka topic, and finally use another Logstash instance to push the logs to Elasticsearch.
Each log record is split by '\t', and one field ('respone_body') is a JSON-formatted string like this:
{"ret":0,"msg":"","data":[]}
How can I convert it to a JSON object in a ruby filter? I need this because my Elasticsearch mapping defines response_body as an object type with dynamic set to true.
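For context, the relevant part of that mapping looks roughly like this (a sketch; only the property name and the object/dynamic settings reflect what is described above):

"response_body": {
    "type": "object",
    "dynamic": true
}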
Below is my Logstash ruby filter:
ruby {
    init => "@kname = ['datetime','level','logger','thread','version','respose_body']"
    code => 'event.append(Hash[@kname.zip(event["message"].split("\t"))])'
    remove_field => ['input_type','beat','host','message','offset','kafka']
}
And this is a sample log line; it is a single string with 5 tab separators:
2019-05-09-15:00:04.258 ACCESS ad_service\inc\Router 333 0.1 {"ret":0,"msg":"","data":[]}
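For reference, a minimal standalone Ruby sketch of what the filter's code line does with this sample line (an assumption: the spaces shown above stand in for real tab characters, and the field names come from the init string):

kname = ['datetime','level','logger','thread','version','respose_body']
line  = "2019-05-09-15:00:04.258\tACCESS\tad_service\\inc\\Router\t333\t0.1\t{\"ret\":0,\"msg\":\"\",\"data\":[]}"

# zip the field names with the tab-separated values, as Hash[@kname.zip(...)] does
fields = Hash[kname.zip(line.split("\t"))]

puts fields['respose_body']        # {"ret":0,"msg":"","data":[]}
puts fields['respose_body'].class  # String -- still a plain string, not an object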
I have tried the solution below, but it still does not work.
Ruby filter code:
event["json_respone_body"] = event["respone_body"].to_json
Logstash error information:
{:timestamp=>"2019-05-10T18:08:19.938000+0800", :message=>"Ruby exception occurred: uninitialized constant LogStash::Filters::Ruby::JSON", :level=>:error}
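The error says the JSON constant cannot be resolved inside the ruby filter, and .to_json would only re-encode the string into a JSON string literal rather than produce an object. For reference, a minimal sketch of what I am trying to do with the field, written with Ruby's json library (an assumption: requiring 'json' in init makes the JSON constant visible inside the filter; the sketch keeps the same event["field"] access style and 'respone_body' spelling as my attempted code above):

ruby {
    # load Ruby's json library so the JSON constant is defined inside the filter
    init => "require 'json'"
    # parse the JSON string into a hash (instead of re-encoding it with .to_json)
    code => 'event["respone_body"] = JSON.parse(event["respone_body"]) if event["respone_body"]'
}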