I have a file that I want to parse with Logstash, but it's JSON without quotation marks. Is there any way to parse it?
In the general case it's not easy, but if the JSON objects are sufficiently similar one might be able to use a grok filter. Show us an example message.
Example log:
Jan 3 19:29:43 atlanta1-na1-e query[2452]: { Zqlquery : { Select: { Fieldnames: [, t1, companyid, , t1, locationid, , t1, transactions, , t2, bytes] }, { FieldTypes: [U_INT, (null), (null), U_INT, (null), (null), U_LONG, (null), (null), U_LONG, (null), (null)] }, { Functionames: [COUNT, SUM] }}, { GROUP BY: { Fieldnames: [, t1, userid] }, { FieldTypes: [U_INT, (null), (null)] }}, }
Jan 3 19:29:43 atlanta1-na1-e query[2452]: { ZQL query time: 3.01 secs}
It looks like these events won't be very similar unless all queries are nearly identical, so parsing them with grok is going to be painful at best. You'll probably need a custom parser for this.
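That said, the second event (the query-time line) has a fixed shape, so grok can handle it on its own. A sketch along these lines might work; the field names (`ts`, `host`, `pid`, `query_time`) are my own choice, not anything from your setup:

```
filter {
  grok {
    match => {
      "message" => "%{SYSLOGTIMESTAMP:ts} %{HOSTNAME:host} query\[%{POSINT:pid}\]: \{ ZQL query time: %{NUMBER:query_time:float} secs\}"
    }
  }
}
```

That still leaves the big query-description events unparsed, which is where the custom parser comes in.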
Yes, these events are not similar. Any idea which custom parser could be used to handle this?
I imagine you'll want to write a Logstash filter plugin. If you're lucky there's a simple Ruby lexer/parser gem that can help with the text parsing if you don't want to hand-code it.
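As a starting point, here is a minimal sketch (in plain Ruby, not a full plugin) of the kind of parsing logic you'd wrap in the plugin's `filter` method. Rather than trying to turn the unquoted text into valid JSON, it pulls out the bracketed lists by name and the query time with regexes. Note one limitation you'd need to handle for real: the `GROUP BY` block reuses the name `Fieldnames`, so the second occurrence overwrites the first in this sketch.

```ruby
# Sketch of a parser for the unquoted ZQL log messages.
# Extracts "Name: [a, b, c]" sections and the query-time value.
def parse_zql(message)
  result = {}

  # Each list section looks like "Name: [item1, item2, ...]".
  # Duplicate section names (e.g. Fieldnames under GROUP BY) overwrite
  # earlier ones here; a real parser would need to disambiguate them.
  message.scan(/(\w+):\s*\[([^\]]*)\]/) do |name, body|
    result[name] = body.split(',').map(&:strip).reject(&:empty?)
  end

  # The timing event is simpler: "ZQL query time: 3.01 secs".
  if message =~ /ZQL query time:\s*([\d.]+)\s*secs/
    result['query_time_secs'] = Regexp.last_match(1).to_f
  end

  result
end
```

Inside a filter plugin you would call this from `filter(event)` and set the extracted values on the event with `event.set(...)`.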