Greetings,
I am a total newbie to Elastic and co, so please educate me. I have Filebeat set up to export JSON from a log file whose lines look like this:

08-Apr-2019 01:11:16.047 INFO [http-nio-443-exec-37] MyClass.Func MyClassFunc() :: InfoObject {action: "start", ID: "a1b2c3", state: "STARTED", time: 1554685876047}

I am trying to get either Filebeat to export the embedded stringified JSON as a structured field, or Logstash to decode it into one for further use. So far, no luck with either.
What I have so far:
- Filebeat config:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /home/cheeni/Dev/wordly/datainfra/usage-analysis/fbinput.log

processors:
  - dissect:
      tokenizer: "%{Date} %{TS} %{Loglevel} %{Processid} %{Class} %{Function} %{Junk} %{ObjectHead} %{MyObject}"
      field: "message"
      target_prefix: ""
  - include_fields:
      fields: ["host", "source", "MyObject"]

output.logstash:
  hosts: ["localhost:5044"]
  codec.json:
    pretty: true
Filebeat output looks like this:
2019-04-15T13:13:30.844-0700 DEBUG [publish] pipeline/processor.go:308 Publish event: {
  [... @timestamp and @metadata stuff ...]
  "MyObject": "{action: "start", ID: "a1b2c3", state: "STARTED", time: 1554685876047}",
  "host": {
    "name": "Mymachine"
  }
}
I'd like to get the stringified JSON value of "MyObject" to be proper JSON:
"MyObject": {
action: "start",
ID: "a1b2c3",
state: "STARTED",
time: 1554685876047
}
- Logstash config:
input {
  beats {
    codec => "json"
    port => "5044"
  }
}

filter {
  json {
    source => "message"
    target => "newObj"
  }
}

output {
  stdout { codec => rubydebug }
}
The Logstash json filter seems to have no effect on MyObject; the rubydebug output just shows the same field with the same string value that Filebeat sent.
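I also wondered whether the json filter should point at MyObject rather than message, i.e. something like this (again untested, and the target name is just a placeholder):

filter {
  json {
    source => "MyObject"   # the field that actually holds the stringified JSON
    target => "newObj"
  }
}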
Any guidance?
Thanks