Logstash Filter help - Newbie question

Hi All,
Sorry I'm just a bit lost on where to start with this one, I was hoping someone would be kind enough to point me in the right direction please?
I have a log that looks like this:

```
{"host":"","message":"Clock advanced by 527 ticks","@version":"1","port":57429,"@timestamp":"2022-01-14T12:12:41.053Z"}
```
It starts and ends with {} and is comma-delimited (but other examples have shown that it can have commas inside the quoted sections, so a simple CSV filter doesn't seem ideal).
Helpfully, though, each message already carries its own field names, i.e. host, message, @version, port and @timestamp. So in theory all I need to do is read the first part as the field name and the second part as the value, and continue until I hit the end character, which is a }.

Could I ask for help with the best way to filter these fields out, ready to be sent to Elasticsearch? I'm just not sure which filter or patterns would make a good start for this one.

Thanks in advance,


Your log is valid JSON, so why don't you use the Logstash json filter?
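For reference, a minimal sketch of the filter form (assuming the JSON arrives in the default message field):

```
filter {
  json {
    source => "message"
  }
}
```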

Cool, thanks, I'll give that a go.
Would you put that as a codec on the input, or run it in the filter section?

I got {"tags":["_jsonparsefailure"]," when putting the json codec on the input.
I'll try the same using the filter now and report back.

Hi, so I moved the json directive into the filter section.
I have a little more debug information, but it's still not liking it. Any ideas?

```
input {
  # udp {
  #   port => 5141
  #   type => "hsl"
  # }
  tcp {
    port => 5141
    id => "TCP_INPUT"
    type => "TCP_INPUT"
  }
}

filter {
  if [type] == "TCP_INPUT" {
    json {
      source => "message"
    }
  }
}
```

Logstash-plain.log shows this:

```
[2022-01-18T16:50:05,892][WARN ][logstash.filters.json    ][main][e929e8c178d8ddd954aa9f2eeb0726d914d7f7df9e7c9f8d2b7020fe33d277a0] Error parsing json {:source=>"message", :raw=>"other mcp_obj_delete (tag=15299) messages", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'other': was expecting ('true', 'false' or 'null')
```

And the output to file shows something like this:

```
{"tags":["_jsonparsefailure"],"port":25326,"@timestamp":"2022-01-18T16:50:05.779Z","message":"[NAT] end_transaction ignored","@version":"1","type":"TCP_INPUT","host":""}
{"tags":["_jsonparsefailure"],"port":25326,"@timestamp":"2022-01-18T16:50:05.779Z","message":"end_transaction -> running 1 ig_commit 1 od_compile 0 od_deploy 0","@version":"1","type":"TCP_INPUT","host":""}
{"tags":["_jsonparsefailure"],"port":25326,"@timestamp":"2022-01-18T16:50:05.779Z","message":"end_transaction calls device_actions()","@version":"1","type":"TCP_INPUT","host":""}
{"tags":["_jsonparsefailure"],"port":25326,"@timestamp":"2022-01-18T16:50:05.779Z","message":"device_actions: on demand feature is not enabled.","@version":"1","type":"TCP_INPUT","host":""}
{"tags":["_jsonparsefailure"],"port":25326,"@timestamp":"2022-01-18T16:50:05.779Z","message":"[NAT] end_transaction -> ig_commit 1","@version":"1","type":"TCP_INPUT","host":""}
```

And the raw output file, before the json config was added, looked like this:

```
{"message":"[NAT] end_transaction -> ig_commit 1","@version":"1","port":25243,"@timestamp":"2022-01-18T10:37:39.719Z","host":""}
{"message":"[NAT] end_transaction ignored","@version":"1","port":25243,"@timestamp":"2022-01-18T10:37:39.719Z","host":""}
```

I've just worked out that it's valid JSON because Logstash produced it!
The message field is actually what came in over TCP, and Logstash has already wrapped the other fields like host and @timestamp around it, ready for Elasticsearch to ingest.
So actually I don't need to do anything else with it at all!
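For anyone landing here later: a quick way to confirm that a Logstash output line really is valid JSON is to parse it outside Logstash. A minimal sketch in Python's standard json module, using the sample event from the first post:

```python
import json

# Sample event emitted by Logstash, copied from the first post in this thread
raw = ('{"host":"","message":"Clock advanced by 527 ticks",'
       '"@version":"1","port":57429,"@timestamp":"2022-01-14T12:12:41.053Z"}')

# json.loads raises json.JSONDecodeError if the line is not valid JSON
event = json.loads(raw)
print(event["message"])  # -> Clock advanced by 527 ticks
print(event["port"])     # -> 57429
```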

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.