Logstash: assign key/value pairs to individual variables

Hello All,

I'm three days new to Logstash, and during this learning process I tried to filter an unstructured log that contains key/value pairs and want to assign each value to a variable.

Unstructured log:

2017-04-04 16:01:16,726 - test.test. - DEBUG - notify_event=[u'[{"Key1":"Value1","Key2":Value2,"Key3": Value3,"}]']

Configuration file:
input {
  file {
    path => "/home/vallikkv/ELK/log/messages"
    type => "messages"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601} - test.test. - %{LOGLEVEL} - notify_event=%{GREEDYDATA:msg}" }
  }
  json {
    source => "msg"
  }
}

output {
  stdout {}
}

Questions:

  1. How do I get the value of each key captured by GREEDYDATA (the msg field) and save them to a file?
  2. How can I pass only these key/value pairs to Elasticsearch?

It looks like the JSON part of the log is malformed. If it were valid JSON, your configuration should probably work, as long as you modify the GREEDYDATA capture to grab just {"Key1":"Value1","Key2":Value2,"Key3":"Value3"}.

Is "Value2" and "Value3" really not double-quoted like "Value1" is?

Yes. String values are double-quoted, and numbers/integers are unquoted.

E.g.: {"Name": "Vallinayagam", "Age": 30}

Given that the example is invalid JSON and looks made up, it might be better to show a real event.

Real event:
2017-04-13 11:50:58,272 - test.TEST. - DEBUG - notify_event=[u'[{"Description":"Card expid","Location":"Main Door","LogTime":"2017-04-13T11:50:58","OccurenceTime":"2017-04-13T11:50:57","Token":"123456","Type":4,"OperatorDetail":{"FirstName":"Simon","LastName":"Rampling"}}]']

Okay, so what follows "notify_event=" looks like a stringified Python array containing a single string value. You can use grok to only capture what's between notify_event=[u' and '], then feed the extracted JSON string to a json filter.
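
Something along these lines should do it (a sketch, untested; it assumes the wrapper is always exactly [u'[ ... ]'] and that the embedded JSON is valid):

filter {
  grok {
    # Capture only the JSON object between the [u'[ and ]'] wrapper
    match => { "message" => "notify_event=\[u'\[%{GREEDYDATA:msg}\]'\]" }
  }
  json {
    # Parse the extracted string into top-level event fields
    source => "msg"
  }
}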


Thanks for pointing that out. I changed the grok filter as below

filter {
  grok {
    # Capture only the JSON object inside the [u'[ ... ]'] wrapper
    match => { "message" => "%{TIMESTAMP_ISO8601} - test.TEST. - %{LOGLEVEL} - notify_event=\[%{WORD}'\[%{GREEDYDATA:msg}\]'\]" }
  }
  json {
    source => "msg"
  }
}

and it worked. But I have some basic questions about viewing the data in Kibana.

I configured the output as below:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => "rubydebug" }
}

  1. How do I view these logs in Kibana and search them in Elasticsearch?
  2. How is this output sent to Elasticsearch? Is it real-time data, and how can I save it to a file?

Any suggestions?

Thanks.

How do I view these logs in Kibana and search them in Elasticsearch?

Go to Kibana, configure an index pattern that matches your index(es), and you're done. With your configuration, Logstash is going to write to indexes named logstash-YYYY.MM.DD, where YYYY.MM.DD is the date when an event occurred.
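
For reference, the index name is controlled by the index option of the elasticsearch output. The sketch below just spells out the default value, so you don't actually need to add it:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # This is the default; shown here only to make the index naming explicit
    index => "logstash-%{+YYYY.MM.dd}"
  }
}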

How is this output sent to Elasticsearch? Is it real-time data, and how can I save it to a file?

Logstash sends the data to Elasticsearch continuously. If things are working okay, events written to log files should be searchable in Kibana within a few seconds.
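
If you also want a copy of the events on disk, one option is to add a file output next to the elasticsearch output. A sketch, assuming a hypothetical output path:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Additionally write each event to a local file, one JSON document per line
  file {
    path => "/home/vallikkv/ELK/output/events.json"
    codec => "json_lines"
  }
  stdout { codec => "rubydebug" }
}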
