hey folks,
My log files are already in JSON format and I have full control over how they look. When I use Logstash, it looks like I have to parse the input as JSON either in the file input plugin (via a codec) or in a filter (via the json filter). I want Logstash to consume less CPU, ideally with no parsing at all: just read and send.
Here is my setup.

Logstash version: 2.1

Input file (one JSON object per line):
{"@timestamp":"2018-03-08T22:15:44,267", "className":"myClassName","logLevel":"WARN","threadName":"main","requestId":"","message":"some messages"}
input {
  file {
    path => "my.json*"
    exclude => "*.gz"
    sincedb_path => "file.sincedb"
    type => "some_type"
  }
}
filter {
  # json {
  #   source => "message"
  # }
  mutate {
    remove_field => ["@version", "path"]
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
}
The output:

{
      "message" => "{\"@timestamp\":\"2018-03-08T22:15:44,267\", \"className\":\"myClassName\",\"logLevel\":\"WARN\",\"threadName\":\"main\",\"requestId\":\"\",\"message\":\"some messages\"}",
         "type" => "some_type",
    "@metadata" => {
        "path" => "file_path"
    }
}
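For reference, here is the codec variant I mentioned above. This is just a sketch, and as far as I understand it still deserializes every line at input time, so it doesn't avoid the parsing cost either:

input {
  file {
    path => "my.json*"
    exclude => "*.gz"
    sincedb_path => "file.sincedb"
    type => "some_type"
    # parse each line as JSON when the event is read,
    # instead of in a separate json filter
    codec => "json"
  }
}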
My question:
- How can I make Logstash read the JSON directly and send it to Elasticsearch without parsing? With the current file input, each line ends up as a string in the message field, which is not what I need.

I know the json filter works, but I want to use fewer host resources. As far as I know, the json filter still has to parse and validate every event, which may hurt when the input volume is large.
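For context, the events would eventually go to an output along these lines (the host and index name here are placeholders, not my real setup):

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder host
    index => "mylogs-%{+YYYY.MM.dd}"   # placeholder daily index pattern
  }
}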
Thanks!