Unable to see the JSON data as fields in Kibana

I have a sample log file with JSON content:

{ LoanCode:99356, LoanDate:3/2/2018, LoanStatus:OPN}

Here is my Logstash conf file:

input {
	file {
		path => "D:/elkstack/*.log"
		start_position => "beginning"
	}
}

filter {
	grok {
		match => { "message" => "%{GREEDYDATA:kvpairs}" }
	}
	json {
		source => "kvpairs"
	}
}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "logstash-%{+YYYY.MM.dd}"
	}
}

I need to create visualizations over the values in the log, e.g. how many loans are OPN.
However, I am unable to see these key/value pairs as fields in Kibana.

Here is a screenshot of Kibana:

This doesn't make sense as written: the grok just copies the full content of the message field into kvpairs, and that content isn't valid JSON, so the json filter can't parse it. Instead, try something like this:

grok {
  match => { "message" => "{ %{GREEDYDATA:kvpairs}}" }
}

kv {
  source => "kvpairs"
  value_split => ":"
  field_split_pattern => ", "
}
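For what it's worth, here is a rough standalone sketch (plain Python, not Logstash) of what the kv filter should produce for your sample line, assuming the grok above has already stripped the surrounding braces:

```python
# Approximation of the kv filter on the captured kvpairs string.
# Assumption: grok has stripped the "{ " prefix and "}" suffix.
kvpairs = "LoanCode:99356, LoanDate:3/2/2018, LoanStatus:OPN"

# field_split_pattern => ", "  and  value_split => ":"
fields = dict(pair.split(":", 1) for pair in kvpairs.split(", "))
print(fields)
# {'LoanCode': '99356', 'LoanDate': '3/2/2018', 'LoanStatus': 'OPN'}
```

Each of those keys should then show up as a separate field on the event, which is what you need for the visualization.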

Many thanks @Christian_Dahlqvist!!

I tried your suggestion and it almost worked, but I see '}' in the field names.
Can you help me avoid them? Do I need to change the log file pattern?

I modified the grok expression to remove those from the kvpairs string. It looks like you are still using your old pattern.
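The difference between the two patterns can be sketched with plain regexes (GREEDYDATA is just `.*` under the hood; this is an illustration, not the exact grok internals):

```python
import re

line = "{ LoanCode:99356, LoanDate:3/2/2018, LoanStatus:OPN}"

# Old pattern: %{GREEDYDATA:kvpairs} captures the whole line,
# braces included, so the last value ends up as "OPN}".
old = re.match(r"(?P<kvpairs>.*)", line).group("kvpairs")

# New pattern: "{ %{GREEDYDATA:kvpairs}}" keeps the literal
# braces outside the capture, so they are stripped.
new = re.match(r"\{ (?P<kvpairs>.*)\}", line).group("kvpairs")

print(old)  # { LoanCode:99356, LoanDate:3/2/2018, LoanStatus:OPN}
print(new)  # LoanCode:99356, LoanDate:3/2/2018, LoanStatus:OPN
```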


My bad :frowning:
I was using my old pattern.

When I switched to your code, it worked as expected.

Thank you @Christian_Dahlqvist. You really made my day.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.