Logstash json in text message

Hi all :slight_smile:

I have a log line that looks like this:
time='2017-08-16 12:39:43.22253194 +0000 UTC' container_name='/r-stack' source='stdout' data='{"name":"stack","hostname":"bc202dc10bc9","pid":28,"level":50,"msg":"this is a message","time":"2017-08-16T12:39:43.106Z","v":0}'

I first use a grok filter to match time, container name, source, and data.
The data field contains a JSON string. But how can I tell Logstash that this is a JSON string which should be parsed as JSON?

I tried it with json_encode, but I am not able to install it. Running ./logstash/bin/logstash-plugin install logstash-filter-json_encode results in:

Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Validating logstash-filter-json_encode
Installing logstash-filter-json_encode
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
Error Bundler::InstallError, retrying 1/10
An error occurred while installing logstash-core-event-java (5.2.2), and Bundler cannot continue.
Make sure that gem install logstash-core-event-java -v '5.2.2' succeeds before bundling.
WARNING: SSLSocket#session= is not supported

and running gem install logstash-core-event-java -v '5.2.2' results in:

ERROR: Could not find a valid gem 'logstash-core-event-java' (= 5.2.2), here is why:
Found logstash-core-event-java (5.2.2), but was for platform java

Can you help me figure out how to proceed from here? Do I really need json_encode, or is there another possibility?
I am using Logstash 5.2.2.

Thanks in advance :slight_smile:


Use a kv filter to parse the line (no need to use grok), then apply a json filter (not json_encode) to the resulting data field.
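A minimal sketch of that approach (the kv filter recognizes single-quoted values on its own, so the quoted data string comes out as one field; field names are taken from the example line above):

```
filter {
  # Split the line into time, container_name, source and data fields.
  # Quoted values (single or double quotes) are kept together.
  kv {
    source => "message"
  }
  # Parse the JSON string that kv extracted into the data field.
  json {
    source => "data"
  }
}
```

With this in place the keys inside the JSON (name, hostname, pid, level, msg, ...) become regular event fields.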

Hi Magnus,

thanks for your reply.
I changed my config file to use the kv filter, and it works like a charm for the example I provided :slight_smile:

But in production I have a data string that is about 600 characters long. Can I tell Logstash to fully parse this data line? It does not get fully parsed, so I get a _jsonparsefailure and this exception:
:exception=>#<LogStash::Json::ParserError: Unexpected end-of-input in VALUE_STRING


Can't really help without seeing the data.

This is an example of the data:
time='2017-08-16 12:39:43.22253194 +0000 UTC' container_name='/r-stack-messenger' source='stdout' data='{"name":"service","hostname":"hostname1","pid":28,"level":50,"msg":"no-kafka-client Metadata request failed: NoKafkaConnectionError [kafka.test.com:9093]: Error: connect ECONNREFUSED\n{ [NoKafkaConnectionError: Error: connect ECONNREFUSED]\n name: 'NoKafkaConnectionError',\n server: 'kafka.test.com:9093',\n message: 'Error: connect ECONNREFUSED' } \n[at apply (/app/node_modules/no-kafka/node_modules/lodash/lodash.js:482:27)]","time":"2017-08-16T12:39:43.106Z","v":0}'

The kv filter fails because of the single quotes inside the data value. If data is always the last key=value pair, you can use a grok filter to extract everything that comes after "data=" into a field, so that the whole string can be sent to the json filter.
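Something like this could work (a sketch, not tested against your real logs; the grok pattern is an assumption based on the example line and may need tweaking). GREEDYDATA swallows everything after data=', including any embedded quotes, up to the final closing quote:

```
filter {
  grok {
    match => {
      "message" => "time='%{DATA:time}' container_name='%{DATA:container_name}' source='%{DATA:source}' data='%{GREEDYDATA:data}'"
    }
  }
  # The captured data field is now the complete JSON string, quotes and all.
  json {
    source => "data"
  }
}
```

Because GREEDYDATA is greedy, it matches up to the last single quote on the line, so single quotes inside the msg value no longer cut the field short the way they did with kv.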

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.