1. In Kibana, the Time field shows up as a string, not as a date field.
2. This is my sample log file:
{"Metrics":"type=COUNTER, name=FacilityMgmtResource Counter, count=0,Time:2017-08-16_01:51:14.926"}
{"Metrics":"type=METER, name=systems.ellora.core.exception.UnhandledRuntimeMapper, count=0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second,Time:2017-08-16_01:51:17.699"}
{"Metrics":"type=TIMER, name=FacilityMgmtResource Timer, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds,Time:2017-08-16_01:51:17.700"} -
3. My filter:
filter {
  csv {
    source => "message"
  }
  csv {
    source => "Metrics"
  }
  date {
    match => ["Time","yyyy-MM-dd_HH:mm:ss,SSS"]
    target => "Time"
  }
  kv {
    value_split => "="
  }
}
Why are you using a csv filter to parse the message field? The input lines contain JSON data.
Show what an event produced by Logstash looks like. Copy/paste from Kibana's JSON tab or use a stdout { codec => rubydebug } output. Format everything you paste as preformatted text.
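For reference, a minimal sketch of such a debug output, assuming it can sit alongside the existing Elasticsearch output in the Logstash pipeline configuration:

output {
  stdout {
    # prints every event with all of its fields in a readable form,
    # which makes it easy to see what the filters actually produced
    codec => rubydebug
  }
}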
It's a JSON file with CSV-format content, and when I try the json filter I get a JSON parsing failure error in logstash.log.
{
  "_index": "filebeat-2017.08.16",
  "_type": "log",
  "_id": "AV3qPHYgJmWWVh1I-bI8",
  "_version": 1,
  "_score": null,
  "_source": {
    "mean_rate": "0.11281360328467671",
    "offset": 211,
    "m1": "0.2",
    "input_type": "log",
    "count": "1",
    "m5": "0.2",
    "rate_unit": "events/second",
    "Time": "2017-08-16_12:29:59,457",
    "source": "/ext/logs/perff.log",
    "message": "{"Metrics" : " type=METER , name=systems.ellora.core.exception.NoDataFoundMapper , count=1 , mean_rate=0.11281360328467671 , m1=0.2 , m5=0.2 , m15=0.2 , rate_unit=events/second , Time=2017-08-16_12:29:59,457 "}",
    "type": "METER",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "m15": "0.2",
    "@timestamp": "2017-08-16T08:50:12.152Z",
    "@version": "1",
    "beat": {
      "hostname": "sdatabase",
      "name": "sdatabase",
      "version": "5.5.0"
    },
    "host": "sdatabase",
    "name": "systems.ellora.core.exception.NoDataFoundMapper",
    "{"Metrics"": " type=METER , name=systems.ellora.core.exception.NoDataFoundMapper , count=1 , mean_rate=0.11281360328467671 , m1=0.2 , m5=0.2 , m15=0.2 , rate_unit=events/second , Time=2017-08-16_12:29:59,457 "
  },
  "fields": {
    "@timestamp": [
      1502873412152
    ]
  },
  "sort": [
    1502873412152
  ]
}
Time still shows up as a string field.
Format everything you paste as preformatted text.
What's the error? The message field looks okay to me. I don't think this looks like CSV at all. Use a json filter or codec, then a kv filter to parse the Metrics field.
Your problem is that the date filter comes before the kv filter so that the Time field doesn't exist when the date filter runs.
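A minimal sketch of that ordering, based on the advice above (json filter first, then a kv filter pointed at the Metrics field, then the date filter); the poster's original date pattern is kept here and still has to match however Time actually appears in the data:

filter {
  json {
    source => "message"      # creates the Metrics field from the JSON line
  }
  kv {
    source => "Metrics"      # extracts key=value pairs, including Time
    value_split => "="
  }
  date {
    # now runs after kv, so the Time field exists when it is parsed
    match => ["Time","yyyy-MM-dd_HH:mm:ss,SSS"]
    target => "Time"
  }
}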
When that has been fixed and the date filter works you need to reindex the existing data so the Time field can be detected as a date field. If you don't care about the existing data in the index you can just delete the index.
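If the existing documents are disposable, deleting the index could look roughly like this (assuming Elasticsearch is listening on localhost:9200 and the index name matches the one shown in the pasted event):

curl -XDELETE 'http://localhost:9200/filebeat-2017.08.16'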
I reindexed.
After changing the filter to json, this is the result:
{
  "_index": "filebeat-2017.08.16",
  "_type": "log",
  "_id": "AV3qc2UWJmWWVh1I-r8i",
  "_version": 1,
  "_score": 2,
  "_source": {
    "mean_rate": "0.0,",
    "Time": "2017-08-16_02:54:53.211"}",
    "source": "/ext/logs/perff.log",
    "type": "log",
    "p95": "0.0,",
    "p98": "0.0,",
    "p75": "0.0,",
    "m15": "0.0,",
    "p99": "0.0,",
    "min": "0.0,",
    "@version": "1",
    "beat": {
      "hostname": "sdatabase",
      "name": "sdatabase",
      "version": "5.5.0"
    },
    "host": "sdatabase",
    "p999": "0.0,",
    "stddev": "0.0,",
    "{"Metrics":"type": "TIMER,",
    "offset": 937,
    "m1": "0.0,",
    "max": "0.0,",
    "input_type": "log",
    "count": "0,",
    "m5": "0.0,",
    "rate_unit": "events/second,",
    "message": "{"Metrics":"type=TIMER, name=FacilityMgmtResource Timer, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds, Time=2017-08-16_02:54:53.211"}",
    "duration_unit": "milliseconds,",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "@timestamp": "2017-08-16T09:50:11.563Z",
    "median": "0.0,",
    "mean": "0.0,",
    "name": "FacilityMgmtResource",
    "{"Metrics"": "type=TIMER, name=FacilityMgmtResource Timer, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds, Time=2017-08-16_02:54:53.211"
  },
  "fields": {
    "@timestamp": [
      1502877011563
    ]
  }
}
My filter:
filter {
  json {
    source => "message"
  }
  json {
    source => "Metrics"
  }
  kv {
    value_split => "="
  }
  date {
    match => ["Time","yyyy-MM-dd_HH:mm:ss,SSS"]
    target => "Time"
  }
}
And why is this field set like this, with "}" appended after the Time value?
Why are you repeatedly ignoring my request that you post pasted log data as preformatted text?
I'm not spending any more time on this until you post logs and configuration that haven't been mangled.
Sorry for the inconvenience; this is my sample log data:
{"Metrics":"type=COUNTER, name=FacilityMgmtResource Counter, count=0, Time=2017-08-16_02:54:53.180"} {"Metrics":"type=METER, name=FacilityMgmtResource Meter, count=0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:53.180"} {"Metrics":"type=METER, name=systems.ellora.core.exception.NoDataFoundMapper, count=1, mean_rate=0.9657515433602983, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:53.180"} {"Metrics":"type=METER, name=systems.ellora.core.exception.UnhandledRuntimeMapper, count=0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:53.180"} {"Metrics":"type=TIMER, name=FacilityMgmtResource Timer, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds, Time=2017-08-16_02:54:53.211"} {"Metrics":"type=COUNTER, name=FacilityMgmtResource Counter, count=1, Time=2017-08-16_02:54:54.032"} {"Metrics":"type=METER, name=FacilityMgmtResource Meter, count=1, mean_rate=0.5260485800519025, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:54.043"} {"Metrics":"type=METER, name=systems.ellora.core.exception.NoDataFoundMapper, count=1, mean_rate=0.5268436948972698, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:54.043"} {"Metrics":"type=METER, name=systems.ellora.core.exception.UnhandledRuntimeMapper, count=0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:54.053"} {"Metrics":"type=TIMER, name=FacilityMgmtResource Timer, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds, Time=2017-08-16_02:54:54.053"} {"Metrics":"type=COUNTER, name=FacilityMgmtResource Counter, count=1, Time=2017-08-16_02:54:55.032"} {"Metrics":"type=METER, name=FacilityMgmtResource Meter, count=1, mean_rate=0.3448138089928063, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:55.042"} {"Metrics":"type=METER, name=systems.ellora.core.exception.NoDataFoundMapper, count=1, mean_rate=0.3439532315922082, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:55.052"} {"Metrics":"type=METER, name=systems.ellora.core.exception.UnhandledRuntimeMapper, count=0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, Time=2017-08-16_02:54:55.053"}
Filter conf:
filter {
  json {
    source => "message"
  }
  json {
    source => "Metrics"
  }
  kv {
    value_split => "="
  }
  date {
    match => ["Time","yyyy-MM-dd_HH:mm:ss,SSS"]
    target => "Time"
  }
}
Three problems (a corrected sketch follows this list):
- The second json filter doesn't make sense because the Metrics field doesn't contain JSON.
- The kv filter must be configured to parse the Metrics field.
- The date filter uses the wrong pattern (comma vs. period right before the milliseconds).
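Putting those three fixes together, a corrected filter might look roughly like this sketch (a single json filter, kv with its source set to Metrics, and a period instead of a comma before the milliseconds):

filter {
  json {
    source => "message"      # one json filter is enough; the Metrics value itself is not JSON
  }
  kv {
    source => "Metrics"      # parse the key=value pairs inside the Metrics field
    value_split => "="
  }
  date {
    # the sample data uses a period before the milliseconds, e.g. 2017-08-16_02:54:53.180
    match => ["Time","yyyy-MM-dd_HH:mm:ss.SSS"]
    target => "Time"
  }
}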
Thank you so much, @magnusbaeck!