drj
(drj)
April 3, 2019, 9:14pm
1
I am in the process of changing my Logstash config from grok to json. Also, is there a way to have all the fields in the JSON become fields over in Kibana, rather than the whole JSON blob being stuck in the message field? That way it would be easy to filter on things like level, for example.
Here is my logstash config:
input {
  cloudwatch_logs {
    start_position => "end"
    log_group => "aws_logs"
    log_group_prefix => [ "true" ]
  }
}
filter {
  # Standard Lambda log line: timestamp, request id, then the message body
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601}\t%{UUID:[lambda][request_id]}\t%{GREEDYDATA:message}" }
    overwrite => [ "message" ]
    tag_on_failure => []
  }
  # START / END control lines
  grok {
    match => { "message" => "(?:START|END) RequestId: %{UUID:[lambda][request_id]}" }
    tag_on_failure => []
  }
  # REPORT lines with duration and memory statistics
  grok {
    match => { "message" => "REPORT RequestId: %{UUID:[lambda][request_id]}\tDuration: %{BASE16FLOAT:[lambda][duration]} ms\tBilled Duration: %{BASE16FLOAT:[lambda][billed_duration]} ms \tMemory Size: %{BASE10NUM:[lambda][memory_size]} MB\tMax Memory Used: %{BASE10NUM:[lambda][memory_used]} MB" }
    tag_on_failure => []
  }
  mutate {
    convert => {
      "[lambda][duration]" => "integer"
      "[lambda][billed_duration]" => "integer"
      "[lambda][memory_size]" => "integer"
      "[lambda][memory_used]" => "integer"
    }
  }
}
output {
  amazon_es {
    hosts => ["es-test.amazonaws.com"]
    region => "us-east-1"
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
Badger
April 3, 2019, 9:38pm
2
That's what a json filter does - parse JSON into fields on the event.
json { source => "message" }
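If the stream can also contain non-JSON lines, the filter's skip_on_invalid_json option keeps those from being flagged. A minimal sketch, assuming your existing input and output stay as they are:

filter {
  json {
    source => "message"
    # skip messages that are not valid JSON instead of tagging them _jsonparsefailure
    skip_on_invalid_json => true
  }
}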
drj
(drj)
April 4, 2019, 1:04pm
3
I changed my config to:

input {
  cloudwatch_logs {
    start_position => "end"
    log_group => [ "/aws" ]
    log_group_prefix => [ "true" ]
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  amazon_es {
    hosts => ["logs.endpoint:443"]
    region => "us-east-1"
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
Now I get this failure tag: _jsonparsefailure
Badger
April 4, 2019, 1:17pm
4
What does the rubydebug output look like for one event?
drj
(drj)
April 4, 2019, 1:32pm
5
Here is an example.
"lambda" => {
"request_id" => "xxxxxxxxxxxxxxxxxxxx"
},
"@version " => "1",
"message" => "START RequestId: xxxxxxxxxxxxxxxxxx Version: $LATEST\n",
"@timestamp " => 2019-04-02T03:55:09.xxxx,
"cloudwatch_logs" => {
"log_group" => "/aws",
"log_stream" => "2019/04/02/xxxxxxxxxxxxxxxxx",
"event_id" => "xxxxxxxxxxxxxxxxxxxx",
"ingestion_time" => 2019-04-02T03:55:21.503Z
}
Badger
April 4, 2019, 2:16pm
6
There is no JSON in that message.
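Lambda log groups mix the START/END/REPORT control lines and any plain text your function prints in with the JSON payloads, so one option is to run the json filter only on lines that look like JSON. A rough sketch, assuming JSON payloads are the only messages that begin with a brace:

filter {
  # only attempt JSON parsing when the message starts with "{"
  if [message] =~ /^\s*\{/ {
    json { source => "message" }
  }
}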
drj
(drj)
April 4, 2019, 2:32pm
7
"@timestamp" => 2019-04-04T12:59xxxxxxxx,
"cloudwatch_logs" => {
"event_id" => "xxxxxxxxxxxxxxxxxxxxxx",
"ingestion_time" => 2019-04-04T13:00:01.xxxx,
"log_stream" => "2019/04/04/[$LATEST]xxxxxxxxxx",
"log_group" => "/aws/xxxxx/"
},
"message" => "Adding env: DB_NAME\n",
"tags" => [
[0] "_jsonparsefailure"
],
"@version " => "1"
system
(system)
Closed
May 2, 2019, 2:33pm
8
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.