Having a problem parsing JSON data using Filebeat and Logstash

I have this type of JSON log:

1485907213693 {"counter_name": "image", "resource_id": "999d4b25-0539-419e-9991-42f2cf26061b", "timestamp": "2017-02-01T00:00:40Z", "counter_volume": 1, "user_id": null, "message_signature": "f9cc0305c9fc95ba9761aded3df342af4e03e8fccc898b1acfd20d6a6041d49d", "resource_metadata": {"status": "active", "name": "OfficialD20.7_v7_raw", "deleted": false, "container_format": "bare", "created_at": "2016-11-11T21:03:21.000000", "disk_format": "raw", "updated_at": "2016-11-15T22:45:51.000000", "protected": false, "min_ram": 0, "checksum": "a185ae0b7314e53d5256", "min_disk": 0, "is_public": false, "deleted_at": null, "properties": {"description": null}, "size": 2147483648}, "source": "open", "counter_unit": "image", "project_id": "03f13ee524ee4ea2afabc839014ff7f2", "message_id": "78209f86-e811-11e6-b78f-0cc47a6431ab", "counter_type": "gauge"}

If I try to parse this using the json codec, it gives me this error:

JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.Long , only hash map or arrays are supported>,

I know that I have a long number before every JSON object. Any idea how to parse this?

Thanks in advance!

You will need to first separate out the initial timestamp from the rest, e.g. using a grok filter, before you can apply a json filter to the remainder. The json codec will not work with data in this format, as it is not all valid JSON.
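
For example, something along these lines (a minimal sketch; the field names epoch_ms and json_payload are illustrative, not required):

filter {
  # Capture the leading epoch-milliseconds number and the JSON remainder separately.
  grok {
    match => { "message" => "^%{NUMBER:epoch_ms}%{SPACE}%{GREEDYDATA:json_payload}$" }
  }
  # Parse the remainder into top-level fields, then drop the raw copy.
  json {
    source => "json_payload"
    remove_field => ["json_payload"]
  }
}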

What timestamp are you talking about?
This number, 1485907213693, is the only thing I need to separate, right?

Which filter should I use for that?

That looks like a millisecond epoch timestamp. Use a dissect filter or a grok filter to store the timestamp in one field and the rest in another field that you can then apply a json filter to.
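
A dissect variant might look like this (again a sketch with illustrative field names; UNIX_MS is the date filter's built-in pattern for epoch milliseconds):

filter {
  # dissect is cheaper than grok for a fixed "prefix payload" layout;
  # the last field consumes everything after the first space.
  dissect {
    mapping => { "message" => "%{epoch_ms} %{json_payload}" }
  }
  json {
    source => "json_payload"
  }
  # Use the millisecond epoch as the event's @timestamp.
  date {
    match => ["epoch_ms", "UNIX_MS"]
    remove_field => ["epoch_ms"]
  }
}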

Yeah, thanks Christian. I'll try that.

Hey Christian, I tried to parse these logs after removing the timestamp that preceded the JSON:

{"counter_name": "image", "resource_id": "999d4b25-0539-419e-9991-42f2cf26061b", "timestamp": "2017-02-01T00:00:40Z", "counter_volume": 1, "user_id": null, "message_signature": "f9cc0305c9fc95ba9761aded3df342af4e03e8fccc898b1acfd20d6a6041d49d", "resource_metadata": {"status": "active", "name": "OfficialD20.7_v7_raw", "deleted": false, "container_format": "bare", "created_at": "2016-11-11T21:03:21.000000", "disk_format": "raw", "updated_at": "2016-11-15T22:45:51.000000", "protected": false, "min_ram": 0, "checksum": "a185ae0b7314e53d5256", "min_disk": 0, "is_public": false, "deleted_at": null, "properties": {"description": null}, "size": 2147483648}, "source": "open", "counter_unit": "image", "project_id": "03f13ee524ee4ea2afabc839014ff7f2", "message_id": "78209f86-e811-11e6-b78f-0cc47a6431ab", "counter_type": "gauge"}

Filebeat sent the events to Logstash, but Logstash only printed the following to the console and didn't send any data to Elasticsearch or create an index.

Message on Logstash console:

2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}
2017-09-25T13:29:40.883Z L9608 %{message}

Logstash.conf

input {
	beats {
		port => 5044
		codec => json
		type => ceilometer
	}
}

filter {
	if [type] == "ceilometer" {
		date {
			match => [ "timestamp", "YYYY-MM-dd'T'HH:mm:ss'Z'" ]
			remove_field => "timestamp"
			timezone => "UTC"
		}
		date {
			match => ["[resource_metadata][created_at]", "YYYY-MM-dd HH:mm:ss.SSSSSS"]
			remove_field => "[resource_metadata][created_at]"
			target => "[resource_metadata][created_at_parsed]"
			timezone => "UTC"
		}
		
	}
}

output {
	stdout { }
	elasticsearch {
		hosts => ["http://localhost:9200"]
		index => "Ceilometer_logs"
		document_type => "json_ceilometer_type"
	}
}

Filebeat.yml

filebeat.prospectors:

- input_type: log

  paths:
    #- /var/log/*.log
    - \Users\ab66415\Desktop\ELK\dileep_docs\ceilometer_logs.txt

  scan_frequency: 5s
  backoff: 3s

  json.keys_under_root: true
  json.add_error_key: true
  close_inactive: 5m

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

Any idea why this is happening?

While troubleshooting the Logstash configuration, comment out the elasticsearch output and only use a stdout output with a rubydebug codec. This will show you the structure of the events you are processing and make it easier to find issues.
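
For example:

output {
  # Print each event's full structure so you can inspect the fields;
  # re-enable the elasticsearch output once the events look right.
  stdout { codec => rubydebug }
}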

I am getting these two errors now. Could you suggest something?

failed to parse [resource_metadata.created_at]

Invalid format: "2016-11-21 22:51:59+00:00" is malformed at " 22:51:59+00:00"

Comment out your date filters so you can see the structure of the event without any errors. You may also need to comment out the json codec in the beats input if that turns out to be causing problems. Then add filter by filter until it works as you want it to.
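
A stripped-down pipeline for that kind of debugging might look like this (a sketch, not a drop-in fix):

input {
  beats {
    port => 5044
    # codec => json   # disabled: Filebeat's json.* settings already decode the lines
    type => ceilometer
  }
}

filter {
  # date filters commented out until the event structure looks right;
  # note that values like "2016-11-21 22:51:59+00:00" carry a numeric UTC
  # offset, so that date filter would presumably need an extra pattern such
  # as "yyyy-MM-dd HH:mm:ssZZ" alongside the microsecond one when re-enabled.
}

output {
  stdout { codec => rubydebug }
}

Separately, Elasticsearch index names must be lowercase, so index => "Ceilometer_logs" will be rejected when you re-enable the elasticsearch output; something like ceilometer_logs should work.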

Thanks a lot, Christian!! :slight_smile:

If you encounter further issues and need assistance, it helps if you can show what the resulting event looks like.
