Joseph (Joseph), January 26, 2017, 12:20pm (#1)
I am new to Logstash and grok filters. My sample logs look like this:
{"level":"INFO", "date":"01/24/2017", "time":"14:22:05", "text":"[Audit]:RBAC Bundle Started"},
{"level":"INFO", "date":"01/24/2017", "time":"14:22:09", "text":"[Audit]:Started Catalog Manager Service"},
{"level":"WARN", "date":"01/24/2017", "time":"14:22:09", "text":"[Audit]:Got null for resourceServerBasepath"},
I am receiving these logs on port 5044; here is my Logstash config:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter {
  grok {
    match => { "message" => '"%{WORD:level}":"%{WORD:type}", "%{WORD:date}":"%{DATE_US:date}", "%{WORD:time}":"%{TIME:timerange}", "%{WORD:text}":"%{GREEDYDATA:information}' }
    add_field => [ "received_at", "%{@timestamp}" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
I tried the Grok Debugger online and the sample data was parsed correctly, but when I use the same pattern in the ELK stack, unparsed data gets pushed into Elasticsearch. Any idea what might be causing the trouble? Any help is appreciated.
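(Side note: when a grok pattern fails to match, Logstash adds a _grokparsefailure tag to the event, so the documents that reach Elasticsearch unparsed should carry that tag. A minimal way to print just those events while debugging might be:)
output {
  if "_grokparsefailure" in [tags] {
    # show only the events the grok filter could not parse
    stdout { codec => rubydebug }
  }
}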
It's a proper JSON log, so why not use the json filter for this? Grab the JSON without the trailing comma and then parse it:
filter {
  grok {
    match => { "message" => "%{GREEDYDATA:json}," }
    add_field => [ "received_at", "%{@timestamp}" ]
  }
  json {
    source => "json"
  }
  mutate {
    remove_field => [ "message" ]
  }
}
For this I get (in rubydebug):
{
    "date" => "01/24/2017",
    "path" => "/root/logs/test.log",
    "@timestamp" => 2017-01-26T12:59:05.683Z,
    "level" => "INFO",
    "received_at" => "2017-01-26T12:59:05.683Z",
    "@version" => "1",
    "host" => "virt",
    "json" => "{\"level\":\"INFO\", \"date\":\"01/24/2017\", \"time\":\"14:22:05\", \"text\":\"[Audit]:RBAC Bundle Started\"}",
    "time" => "14:22:05",
    "text" => "[Audit]:RBAC Bundle Started"
}
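An alternative, if you would rather skip grok entirely: since each line is valid JSON apart from the trailing comma, you could strip the comma with a mutate gsub and feed the result straight to the json filter. A rough, untested sketch (it assumes the trailing comma is the only extra character on the line):
filter {
  mutate {
    # drop a trailing comma (and any trailing whitespace) from the raw line
    gsub => [ "message", ",\s*$", "" ]
  }
  json {
    source => "message"
  }
  mutate {
    remove_field => [ "message" ]
  }
}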
Joseph (Joseph), January 26, 2017, 2:57pm (#4)
Hi, thanks for your reply. I wasn't sure whether the Logstash json filter parses the "date" field as a date or simply as another string, which is why I hesitated to use it. Will I be able to query the "date" field in Elasticsearch with this filter?
Then add the following mutate and date filters after the json filter:
mutate {
  add_field => { "datetime" => "%{date} %{time}" }
}
date {
  match => [ "datetime", "MM/dd/yyyy HH:mm:ss" ]
  target => "timestamp"
}
Now you can use timestamp as a time field in Kibana and Elasticsearch.
{
    "level" => "WARN",
    "path" => "/root/logs/test.log",
    "datetime" => "01/24/2017 14:22:09",
    "@timestamp" => 2017-01-26T15:30:46.083Z,
    "received_at" => "2017-01-26T15:30:46.083Z",
    "@version" => "1",
    "host" => "virt",
    "json" => "{\"level\":\"WARN\", \"date\":\"01/24/2017\", \"time\":\"14:22:09\", \"text\":\"[Audit]:Got null for resourceServerBasepath\"}",
    "text" => "[Audit]:Got null for resourceServerBasepath",
    "timestamp" => 2017-01-24T13:22:09.000Z
}
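Note that 14:22:09 in the log shows up as 13:22:09Z above because the date filter parsed the time using the Logstash host's local timezone. If the logs are written in a different timezone, the date filter's timezone option can pin it explicitly, for example (the zone name below is only a placeholder):
date {
  match => [ "datetime", "MM/dd/yyyy HH:mm:ss" ]
  target => "timestamp"
  # placeholder zone; use whichever zone the logs are actually written in
  timezone => "Europe/Paris"
}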
Joseph (Joseph), January 27, 2017, 9:26am (#6)
Thanks for your inputs! They were very helpful. I am able to query the data using "timestamp". Just to make it perfect, is it possible to replace "@timestamp" with "timestamp", so that when I open Kibana Discover the data is sorted by "timestamp" rather than by "@timestamp"?
Yes, the target option of the date filter has a default value of @timestamp. Just delete that line and the parsed timestamp will be written into @timestamp.
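In other words, the date block would simply become (based on the config above):
date {
  # no target option, so the parsed value overwrites @timestamp
  match => [ "datetime", "MM/dd/yyyy HH:mm:ss" ]
}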
Joseph (Joseph), January 27, 2017, 10:29am (#8)
I think I am already doing that.
My filter and output look like this:
filter {
  grok {
    match => { "message" => "%{GREEDYDATA:json}," }
    add_field => [ "received_at", "%{@timestamp}" ]
  }
  json {
    source => "json"
  }
  mutate {
    add_field => { "datetime" => "%{date} %{time}" }
    remove_field => [ "message" ]
  }
  date {
    match => [ "datetime", "MM/dd/yyyy HH:mm:ss" ]
    target => "datetime"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "filebeat-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}
Though I changed the target to "datetime", Kibana still seems to take @timestamp as the primary time field.
system (system) closed this topic on February 24, 2017, 10:29am (#9): this topic was automatically closed 28 days after the last reply. New replies are no longer allowed.