I have JSON-ified logs coming into my Logstash server. What's the difference between putting the json codec in
input {
  udp {
    port => 5555
    codec => json
  }
}
vs.
filter {
  json {
    source => "message"
  }
}
The json filter and the json codec accomplish the same thing. One advantage of the json filter is that you can apply it selectively, e.g. to just one of the fields and only under certain conditions. You can probably always use a json filter instead of a json codec, but the opposite isn't true.
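For example, you can parse only the events that match a condition, something a codec can't do. A minimal sketch (the "type" field and its value here are made up for illustration):
filter {
  # Only parse events whose (hypothetical) type marks them as JSON;
  # everything else passes through untouched.
  if [type] == "json-app-log" {
    json {
      source => "message"
    }
  }
}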
Thanks. I'm getting _jsonparsefailure errors when I use the json codec, so I'm trying the json filter approach.
Using the json codec, my fields are automagically separated for me, that is, I see them in Kibana. How can I extract my fields using the json filter? My log events look like this.
{
  "key1": "value1",
  "key2": "value2",
  "msg": "2017-03-06 INFO [com.company.app] Hello world"
}
I tried
json {
  source => "message"
  add_field => {
    "key1" => "%{[message][key1]}"
    "key2" => "%{[message][key2]}"
    "log_msg" => "%{[message][msg]}"
  }
}
Show your full configuration and what your events look like when they leave Logstash. Use a stdout { codec => rubydebug }
output so we can see exactly what's going on.
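That is, an output section like this:
output {
  stdout {
    # rubydebug prints each event as a readable Ruby hash.
    codec => rubydebug
  }
}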
Here's the JSON log event
{
  "key1": "value1",
  "key2": "value2",
  "msg": "2017-03-06 INFO [com.company.app] Hello world"
}
Here's my configuration file
input {
  udp {
    port => 5555
  }
}
filter {
  mutate {
    strip => "message"
  }
  json {
    source => "message"
    # What do I put here to pick apart the fields in "msg" so my grok filter works?
  }
  grok {
    match => {
      "msg" => "%{TIMESTAMP_ISO8601:logdate}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[(?<classname>[^\]]+)\]%{SPACE}%{GREEDYDATA:msgbody}"
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
Here's what I see in logstash.stdout
{
  "message" => "{\"key1\":\"value1\", \"key2\":\"value2\", \"msg\":\"2017-03-06 INFO [com.company.app] Hello world\"}",
  "@version" => "1",
  "@timestamp" => "2017-03-06",
  "host" => "0:0:0:0:0:0:0:1"
}
Here's what I want to see in logstash.stdout, which is what I see when I use the json codec.
{
  "message" => "{\"key1\":\"value1\", \"key2\":\"value2\", \"msg\":\"2017-03-06 INFO [com.company.app] Hello world\"}",
  "key1" => "value1",
  "key2" => "value2",
  "classname" => "com.company.app",
  "loglevel" => "INFO",
  "msgbody" => "Hello world",
  "@version" => "1",
  "@timestamp" => "2017-03-06",
  "host" => "0:0:0:0:0:0:0:1"
}
I think I got it now. This Stack Overflow post gave me ideas. So my filter block looks like this now
filter {
  json {
    source => "message"
    target => "parsedJson"
  }
  mutate {
    add_field => {
      "key1" => "%{[parsedJson][key1]}"
      "key2" => "%{[parsedJson][key2]}"
      "log_message" => "%{[parsedJson][msg]}"
    }
  }
  grok {
    match => {
      "log_message" => [
        # Grok filters go here.....
      ]
    }
  }
  mutate {
    remove_field => ["message", "parsedJson", "log_message"]
  }
}
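For what it's worth, the intermediate parsedJson field and the add_field copies may not be necessary: when no target is set, the json filter writes the parsed keys to the top level of the event. A shorter sketch of the same idea, reusing the grok pattern from the config above:
filter {
  json {
    # With no "target", key1, key2 and msg land at the event root.
    source => "message"
  }
  grok {
    # Pick apart the embedded plain-text log line in "msg".
    match => {
      "msg" => "%{TIMESTAMP_ISO8601:logdate}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}\[(?<classname>[^\]]+)\]%{SPACE}%{GREEDYDATA:msgbody}"
    }
  }
  mutate {
    remove_field => ["message", "msg"]
  }
}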
Alas, the json filter gives the same _jsonparsefailure as the json codec. I should've expected that, though I was hoping otherwise.
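If anyone else runs into this: events that fail to parse are tagged _jsonparsefailure, so one way to see exactly which payloads are failing is to route them to their own output (a sketch; the file path is just an example):
output {
  if "_jsonparsefailure" in [tags] {
    # Keep the raw, unparsed events so the offending bytes can be inspected.
    file {
      path => "/tmp/json-failures.log"
    }
  } else {
    stdout {
      codec => rubydebug
    }
  }
}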