How to parse JSON field obtained from split filter

Hello all,

I have used the split filter to break up a multiline XML document, and as a result one of the fields obtained is called parsed.logEntry.logTimestamp, as you can see in the following image:

[image]

What I would like now is a way to turn this date information into a timestamp that Kibana can interpret, so I can use it to manage indexes.

I have tried to use the json filter as follows:

	if [type] == "xml" {
		xml {
			namespaces => {
				"xsi" => "http://www.w3.org/2001/XMLSchema-instance"
			}
			store_xml => true
			source => "message"
			target => "parsed"
		}

		split {
			field => "[parsed][logEntry]"
		}

		json {
			source => "%{[parsed][logEntry][logTimestamp]}"
			target => "parsed_json"
			add_field => {
				days => "%{[parsed_json][day]}"
			}
		}

		mutate {
			remove_field => ["message"]
		}
	}

Do you have any idea?

Thank you all.

If none of the fields ever have leading zeroes then this should work:

mutate { add_field => { "ts" => "%{year}/%{month}/%{day} %{hour}:%{minute}:%{second}" } }
date { match => [ "ts", "yyyy/M/d H:m:s" ] }

Hello Badger, thank you a lot for your answer.

I have tried what you mentioned, but what I get as the "ts" field is the literal string: "%{year}/%{month}/%{day} %{hour}:%{minute}:%{second}". So it seems that in some way we have to extract the values from the parsed.logEntry.logTimestamp field.

As a check, I added a new field with the value obtained from [parsed][logEntry][logTimestamp], and the result was: {month=11, hour=9, year=2013, day=4, second=12, minute=24}. I then tried to extract the fields with a grok/dissect filter, but without success.

Any idea? Am I doing something wrong?

Thank you a lot.

Sorry, clearly I wasn't concentrating when I wrote that :slight_smile:

If the parsed.logEntry.logTimestamp object has all those sub-fields then it would be more like...

"%{[parsed][logEntry][logTimestamp][year]}/%{[parsed][logEntry][logTimestamp][month]}/%{[parsed][logEntry][logTimestamp][day]} %{[parsed][logEntry][logTimestamp][hour]}:%{[parsed][logEntry][logTimestamp][minute]}:%{[parsed][logEntry][logTimestamp][second]}".

To be honest I was ignoring your filter and just looking at the JSON you posted from Kibana.

We might do better if you removed the json filter and showed what %{[parsed][logEntry][logTimestamp]} looks like in output { stdout { codec => rubydebug } }. Sometimes (not always) there is a better idiomatic solution in logstash.
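Putting that together with my earlier snippet, the full step would look something like this (a sketch only; the temporary "ts" field name is mine, and it assumes all six sub-fields are present on every event):

```
mutate {
	add_field => {
		"ts" => "%{[parsed][logEntry][logTimestamp][year]}/%{[parsed][logEntry][logTimestamp][month]}/%{[parsed][logEntry][logTimestamp][day]} %{[parsed][logEntry][logTimestamp][hour]}:%{[parsed][logEntry][logTimestamp][minute]}:%{[parsed][logEntry][logTimestamp][second]}"
	}
}
date {
	match => [ "ts", "yyyy/M/d H:m:s" ]
}
mutate {
	remove_field => ["ts"]
}
```

By default the date filter writes the parsed value to @timestamp, which is the field Kibana uses for time-based index management.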

Hello Badger,

I have printed the result of the "parsed.logEntry.logTimestamp" field to the console, and what I obtained is the following:

{month=11, hour=9, year=2013, day=4, second=12, minute=24}.

However, logstash is also printing the literal string %{[parsed][logEntry][logTimestamp]} several times, but I assume that is normal.

So you are right, it is not JSON, even though Kibana displays it that way in the image I provided.

So I think this field should be parsed with grok, right?

Thank you a lot.
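For what it's worth, since logTimestamp is already a hash of numeric sub-fields, a ruby filter may be simpler than grok here. A minimal sketch of the formatting logic in plain Ruby (in a real ruby filter you would read the hash with event.get and write the result back with event.set; the hash literal below just mirrors the rubydebug output above):

```ruby
# The logTimestamp hash as printed by rubydebug (keys are strings in the event)
ts = { "month" => 11, "hour" => 9, "year" => 2013,
       "day" => 4, "second" => 12, "minute" => 24 }

# Zero-pad each part so a fixed date pattern like "yyyy/MM/dd HH:mm:ss" matches
formatted = format("%04d/%02d/%02d %02d:%02d:%02d",
                   ts["year"], ts["month"], ts["day"],
                   ts["hour"], ts["minute"], ts["second"])
# => "2013/11/04 09:24:12"
```

Inside a ruby filter this would become event.set("ts", formatted), followed by a date filter to parse "ts" into @timestamp.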

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.