Hi everyone,
I am using ELK 7.2.
I read a past answer here that I found useful, since what I need to do is make Logstash correctly parse JSON data, including arrays of nested JSON objects.
However, if I need to parse both a plain JSON object and JSON in the form of an array of objects, what would the configuration be? At the moment the incoming messages look like [{},{}]
Is there a way to extend the regex I am already using (taken from an answer on this forum, shown below) so that it also matches that case?
if [message] =~ "\A\{.+\}\z" {
  json { .. }
}
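Something along these lines is what I had in mind: a second branch for messages that are a top-level array. This is only a rough sketch I have not fully tested; as far as I understand, the json filter needs a target when the parsed result is an array rather than a hash, and "parsedArray" is just a placeholder name I picked.
filter {
  # Plain JSON object: parse into the event root as before
  if [message] =~ "\A\{.+\}\z" {
    json { source => "message" }
  }
  # Top-level JSON array, e.g. [{},{}]: parse into a target field,
  # since (I believe) the json filter cannot merge an array into the event root
  else if [message] =~ "\A\[.+\]\z" {
    json {
      source => "message"
      target => "parsedArray"
    }
    # optionally turn each array element into its own event
    # split { field => "parsedArray" }
  }
}
Would that be the right approach, or is there a cleaner way to handle both shapes in one place?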
To make the setup clearer: because of this problem I tried, as a temporary workaround, to cast every value that passes through Logstash to a string. That way the objects will not be aggregatable in Elasticsearch/Kibana, but at least they can be read. I tried this ruby filter:
filter {
  ruby {
    code => '
      # Recursively walk extraData and rewrite every leaf value as a string,
      # producing fields like [extraData][foo][0] for nested hashes and arrays
      def stringify(object, name, event)
        if object == nil
          event.set(name, "nil")
        elsif object.kind_of?(Hash) and object != {}
          object.each { |k, v| stringify(v, "#{name}[#{k}]", event) }
        elsif object.kind_of?(Array) and object != []
          object.each_index { |i|
            stringify(object[i], "#{name}[#{i}]", event)
          }
        else
          event.set(name, object.to_s)
        end
      end
      stringify(event.get("extraData"), "[extraData]", event)
    '
  }
}
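To illustrate what I expect this to do (based on my reading of the recursion, with a made-up extraData value):
# hypothetical input
"extraData" => { "user" => { "id" => 7 }, "tags" => ["a", "b"] }

# expected output fields (all strings)
[extraData][user][id] => "7"
[extraData][tags][0]  => "a"
[extraData][tags][1]  => "b"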
and it seems to work, at least in my local configuration. However, in the actual environment (which is provisioned by Terraform) it does not work.
If I could correctly ingest and parse the source [message] in the first place, perhaps this ruby filter to cast everything to a string would not even be needed.
Can you please confirm the regex above? How can I get both a JSON object and a JSON array of objects correctly parsed by Logstash?
Many thanks