I have a JSON file containing roughly 4,000 JSON objects, one object per line. I'm using the file input plugin with the multiline codec so that Logstash parses all of the objects. I'm also flattening each JSON object and sending the output to standard output with the stdout plugin. However, only 9 records print to the screen.
I want to do some aggregation on the data, so I need all the records printed to the screen. How can I accomplish this?
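For what it's worth, I sanity-checked that a file in this shape really does have one object per line, so the record count should just be the line count. This uses made-up sample data and a temporary path, not my real file:

```shell
# Write a small sample file in the same one-object-per-line shape (hypothetical data)
cat > /tmp/sample.json <<'EOF'
{"fields":{"host":"web-1"},"timestamp":1700000000}
{"fields":{"host":"web-2"},"timestamp":1700000001}
{"fields":{"host":"web-3"},"timestamp":1700000002}
EOF

# One object per line, so the expected record count is the line count
wc -l < /tmp/sample.json
```

So for my real file I expect about 4,000 events, not 9.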
After reading a couple of posts on this forum, I came across @Badger's solution for reading in JSON content, and I set up my Logstash config like this:
input {
  file {
    path => "/a/b/file1.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => multiline {
      pattern => "^Spalanzani"
      negate => true
      what => previous
      auto_flush_interval => 1
      multiline_tag => ""
    }
  }
}
filter {
  json {
    source => "message"
    remove_field => [ "message" ]
  }
  if [fields] {
    ruby {
      code => '
        event.get("fields").each { |k, v|
          event.set(k, v)
        }
        event.remove("fields")
      '
    }
  }
  if [tags] {
    ruby {
      code => '
        event.get("tags").each { |k, v|
          event.set(k, v)
        }
        event.remove("tags")
      '
    }
  }
  date {
    match => [ "timestamp", "UNIX" ]
  }
}
output {
  stdout { codec => rubydebug { metadata => false } }
}