Create pattern in Logstash

Greetings,
My logs are of the format
2017-12-08 01:42:23,658 app="capture", level=ERROR, thread=play-thread-5, logger=play, userId="client:release", reqAction="GET", reqUrl="/api/labels", message="

console.App.html action not found

Please help me break these up with a grok expression; I want the timestamp, the app field (e.g. "capture"), and the log level.

I have applied %{TIMESTAMP_ISO8601:timestamp} %{WORD:action}

Any help would be really appreciated.

Regards
Shrikant

Use ^%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv} to get a timestamp and the key/value pairs into one field each, then use a kv filter to parse the kv field that you get from the grok filter.
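
Something like this, as a minimal sketch (grok only; the kv filter comes next):

    filter {
      grok {
        # Capture the leading timestamp; everything after it goes into
        # a single field (named kv here) for the kv filter to parse.
        match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}" }
      }
    }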

Can you please elaborate on how to use the kv filter, along with an example?
Regards
Shrikant

The Description section in the documentation contains an example and an explanation. In your case you'll have to adjust the field_split option since your fields are separated by ", ".

https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html
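
In this case the kv filter could look roughly like this (a sketch, assuming the grok filter above stored the key/value string in a field named kv):

    kv {
      # Parse the grok-extracted field rather than the whole message.
      source => "kv"
      # Every character listed here acts as a delimiter, so ", " splits
      # on commas and swallows the separating spaces.
      field_split => ", "
    }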

@magnusbaeck thank you for the reply.
I applied
filter {
  if [type] == "plateu" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}" }
    }
    date {
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
      timezone => "UTC"
    }
  }
  kv {
    field_split => ",?"
    remove_field => ["app"]
  }
}

but I am facing an issue: I am getting a field named "658 app" instead of "app".

Regards
Shrikant

Try using DATA instead of GREEDYDATA.
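
It may also be worth noting that the kv filter parses the message field by default, so if you want it to work on the string captured by the grok filter you probably need to point it at that field. A minimal sketch, assuming the grok filter stored the key/value pairs in a field named kv:

    kv {
      # By default the kv filter reads the message field, which still
      # starts with the timestamp (hence keys like "658 app");
      # source points it at the grok capture instead.
      source => "kv"
      field_split => ", "
    }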

Thank you for the help, @magnusbaeck. And what about this kind of log?

2017-01-05 17:01:39 [WARN ] ~ fail-safe cleanup (collections)

I want the log level, but I am not getting it.

For that kind of message you need a different grok expression. A single grok filter can list multiple expressions that will be tried in order, so you can have one expression that attempts to match a general message with a log level and one that extracts key/value pairs.
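
For example, a sketch listing two patterns (the more specific one first, since patterns are tried in order):

    grok {
      match => {
        "message" => [
          # Bracketed-level style: 2017-01-05 17:01:39 [WARN ] ~ ...
          "%{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:level}%{SPACE}\] ~ %{GREEDYDATA:msg}",
          # Key/value style: 2017-12-08 01:42:23,658 app="capture", ...
          "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}"
        ]
      }
    }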

@magnusbaeck thank you, yes, I filtered the logs, but now I am facing an issue whenever I run Logstash with this config file:

input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "plateu" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}" }
    }
    date {
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
      timezone => "UTC"
    }
  }
  kv {
    field_split => ",?"
  }
  else if [type] == "manage" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} [%{LOGLEVEL:level} ]" }
    }
    date {
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
      timezone => "UTC"
    }
  }
}
output {
  if [type] == "plateu" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "denmark1"
    }
  }
  else if [type] == "hill" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "denmark1"
    }
  }
  stdout {}
}

When I don't use kv it works fine, but when I use it I am facing an issue; it says:
[2017-12-14T16:54:59,871][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, { at line 19, column 7 (byte 374) after filter {\n\tif [type] == "plateu" {\n\t grok {\n\t match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}"}\n\t }\n date {\n\t match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]\n\t timezone => "UTC"\n\t }\n }\n kv {\n field_split => ",?"\n }\n\telse "}

Please help!

Your kv filter falls outside the if conditional because you have two } between the date and kv filters.
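
One way to fix it is to move the kv filter inside the conditional, before its closing brace. A sketch of the corrected structure:

    filter {
      if [type] == "plateu" {
        grok {
          match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:kv}" }
        }
        date {
          match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
          timezone => "UTC"
        }
        # The kv filter now sits inside the if block, so the else below is valid.
        kv {
          field_split => ",?"
        }
      }
      else if [type] == "manage" {
        # ... grok and date filters for the "manage" type, as before ...
      }
    }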

@magnusbaeck in the output I am also getting a date parse failure (_dateparsefailure).

Show an example event produced by Logstash. Copy/paste from Kibana's JSON tab.

{
  "_index": "monk1",
  "_type": "manage",
  "_id": "AWBU0-UExB2eoZuZIuyo",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2017-12-14T11:40:23.051Z",
    "offset": 179,
    "level": "WARN",
    "@version": "1",
    "input_type": "log",
    "beat": {
      "hostname": "TLA-LT-LINUX-004",
      "name": "TLA-LT-LINUX-004",
      "version": "5.4.0"
    },
    "host": "TLA-LT-LINUX-004",
    "source": "/home/niked/Downloads/log/one.log",
    "message": "2017-01-05 17:01:39 [WARN ] ~ fail-safe cleanup (collections) : org.hibernate.engine.loading.CollectionLoadContext@6d59e67brs=com.mchange.v2.c3p0.impl.NewProxyResultSet@18b3517",
    "type": "manage",
    "tags": [
      "beats_input_codec_plain_applied",
      "_dateparsefailure"
    ],
    "timestamp": "2017-01-05 17:01:39"
  },
  "fields": {
    "@timestamp": [
      1513251623051
    ]
  },
  "sort": [
    1513251623051
  ]
}

I'm a bit surprised that the ISO8601 pattern isn't able to parse the timestamp field, but the "yyyy-MM-dd HH:mm:ss.SSS" pattern is obviously wrong since timestamp doesn't have millisecond resolution.
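
So listing a pattern without milliseconds should help; for example (a sketch):

    date {
      # Try a seconds-only pattern first, matching timestamps like
      # "2017-01-05 17:01:39"; the other formats remain as fallbacks.
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
      timezone => "UTC"
    }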

@magnusbaeck thank you for the reply.
I have two different timestamp formats across a number of files, for example:

09:41:58,932 WARN ~ Tried to associate with unreachable remote address. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters.

2016-12-14 09:29:37,750 [WARN ] ~ Tried to associate with unreachable remote address. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters.

What should I do?

So the date is sometimes missing? That's going to be a problem. Logstash will default to today's date when you only have the time, so if you're processing the logs in real time you'll be okay most of the time.

As for how to parse it, both the grok and date filters support multiple expressions that'll get tried in order. You need one set of expressions for date+time and one with time only.
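
For example, a sketch assuming the two formats shown above:

    grok {
      match => {
        "message" => [
          # Full date and time with a bracketed level.
          "%{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:level}%{SPACE}\] ~ %{GREEDYDATA:msg}",
          # Time only with a bare level.
          "%{TIME:timestamp} %{LOGLEVEL:level} ~ %{GREEDYDATA:msg}"
        ]
      }
    }
    date {
      # Date patterns are likewise tried in order.
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "HH:mm:ss,SSS"]
    }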

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.