Hi all,
I'm trying to parse a big JSON line into Logstash. The JSON is an array of multiple objects. I'd like each object to become an event and each value to become a field. This is my filebeat config:
filebeat.inputs:
- type: log
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - "D:/data/*"
  processors:
    - decode_json_fields:
        fields: ["studio_state"]

output.logstash:
  # The Logstash hosts
  hosts: ["logstash:5151"]
The JSON string looks like this (but is much lengthier):
[{
"Location": "0.0,0.0",
"event_name": "Resource Tab Selected",
"main_category": "Sequencer",
"sub_category": "Not available",
"start_timestamp": "2018-08-19T00:17:04.847",
"end_timestamp": "2018-08-19T00:17:04.847"
}, {
"Location": "0.0,0.0",
"event_name": "Resource Tab Selected",
"main_category": "Sequencer",
"sub_category": "Not available",
"start_timestamp": "2018-08-19T00:17:04.847",
"end_timestamp": "2018-08-19T00:17:04.847"
}]
And my logstash config:
input {
  tcp {
    port => 5000
  }
  beats {
    codec => "json"
    port => 5151
  }
}

filter {
  json {
    source => "message"
  }
}

## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
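I've also been wondering whether the array should be split on the Logstash side instead, with the split filter, something like this (again just a sketch, the field name "parsed" is my own guess):

filter {
  json {
    source => "message"   # the whole array as one string
    target => "parsed"    # put the decoded array under a temporary field
  }
  split {
    field => "parsed"     # emit one event per array element
  }
}

Is that the right direction?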
I've tried many different combinations but with this one I get this message:
{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":343,"time":{"ms":78}},"total":{"ticks":389,"time":{"ms":78},"value":389},"user":{"ticks":46}},"info":{"ephemeral_id":"714c1baa-3aa1-4d39-8884-273d6edea87b","uptime":{"ms":180078}},"memstats":{"gc_next":4194304,"memory_alloc":1702152,"memory_total":3961352,"rss":12288}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":3,"running":3,"started":1}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0,"filtered":1,"total":1}}},"registrar":{"states":{"current":3,"update":1},"writes":{"success":1,"total":1}}}}}
The most annoying part to me is: "events":{"active":0,"filtered":1,"total":1}
I don't understand why the event is being dropped. When I run the JSON through a linter it is valid. If I split it across multiple lines instead of one, it does reach Elasticsearch via Logstash, but it creates one event per line.
Any help appreciated.