Instead of trying to use the multiline codec to pick out each object and then mutate it into valid JSON, I would consume the whole file as a single event and then restructure the result. For the input I would use something like
file {
    path => "/home/ec2-user/t.test/foo.txt"
    codec => multiline {
        # A pattern that never matches: with negate => true and what => "previous"
        # every line gets appended to the previous one, so the whole file
        # becomes a single event.
        pattern => "^Spalanzani"
        negate => true
        what => "previous"
        # Flush the accumulated event after one second of inactivity; otherwise
        # it would never be emitted, since the pattern never matches.
        auto_flush_interval => 1
        # Do not add the "multiline" tag to the event.
        multiline_tag => ""
    }
    start_position => "beginning"
    sincedb_path => "/dev/null"
}
Note that the path option must be absolute; you cannot use a relative path such as "./temp/*.json".
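This assumes the file contains a single top-level JSON object whose keys are the names that should end up in dname, each mapping to an object of fields. A hypothetical foo.txt might look like this (the "server1"/"ip"/"status" names are made up purely for illustration; your real field names will differ):

{
    "server1": { "ip": "10.0.0.1", "status": "up" },
    "server2": { "ip": "10.0.0.2", "status": "down" }
}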
Then restructure it using
json {
    # Parse the whole file; keep the result under @metadata so the
    # intermediate object is not sent to the output.
    source => "message"
    target => "[@metadata][json]"
    remove_field => [ "message" ]
}
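At this point [@metadata][json] holds the parsed object as a hash, and because it lives under [@metadata] it will never reach the output. With the hypothetical file above it would be

{
    "server1" => { "ip" => "10.0.0.1", "status" => "up" },
    "server2" => { "ip" => "10.0.0.2", "status" => "down" }
}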
ruby {
    code => '
        # Turn the hash of { name => fields } into an array of hashes,
        # copying each top-level key into a "dname" field on its value.
        json = event.remove("[@metadata][json]")
        if json.is_a? Hash
            new_json = []
            json.each { |k, v|
                new_json << v.merge({ "dname" => k })
            }
            event.set("[@metadata][dname]", new_json)
        end
    '
}
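After this filter [@metadata][dname] is an array with one hash per name. With the hypothetical data above it would contain

[
    { "ip" => "10.0.0.1", "status" => "up", "dname" => "server1" },
    { "ip" => "10.0.0.2", "status" => "down", "dname" => "server2" }
]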
split { field => "[@metadata][dname]" }
ruby {
    code => '
        # Each event now carries one element of the array in [@metadata][dname];
        # copy its key/value pairs onto the event as top-level fields.
        d = event.remove("[@metadata][dname]")
        if d.is_a? Hash
            d.each { |k, v|
                event.set(k, v)
            }
        end
    '
}
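The split filter produces one event per array element, and the final ruby filter promotes the contents of that element to top-level fields. With the hypothetical data above you would end up with two events roughly like

{ "dname" => "server1", "ip" => "10.0.0.1", "status" => "up" }
{ "dname" => "server2", "ip" => "10.0.0.2", "status" => "down" }

plus the usual @timestamp and @version fields that Logstash adds on its own.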