Logstash 7.2 cannot match date on JSON data from Cisco telemetry

Hello !

I'm a newbie on Logstash and I'm trying to decode Cisco telemetry data in JSON format. My Logstash filter configuration is the following:
filter {
  if [type] == "xjson" {
    json {
      source => "message"
    }
    mutate {
      remove_field => [ "message" ]
    }
    date {
      match => [ "[Rows][Timestamp]", "UNIX_MS", "UNIX" ]
    }
    date {
      match => [ "[Telemetry][msg_timestamp]", "UNIX_MS", "UNIX" ]
    }
  }
}
The JSON data is the following:
{
"_index": "logstash-2019.07.21-000001",
"_type": "_doc",
"_id": "5fRfFGwBWU64Pkw1pznQ",
"_version": 1,
"_score": null,
"_source": {
"@timestamp": "2019-07-21T11:51:48.864Z",
"type": "xjson",
"@version": "1",
"Telemetry": {
"collection_id": 124647,
"msg_timestamp": 1563709831176,
"collection_end_time": 1563709831185,
"node_id_str": "ncs55",
"subscription_id_str": "int2",
"collection_start_time": 1563709831176,
"encoding_path": "Cisco-IOS-XR-infra-statsd-oper:infra-statistics/interfaces/interface/latest/generic-counters"
},
"Rows": [
{
"Content": {
"last-discontinuity-time": 1563431358,
"packets-received": 0,
"runt-packets-received": 0,
"seconds-since-packet-sent": 4294967295,
"input-queue-drops": 0,
"multicast-packets-sent": 0,
"seconds-since-packet-received": 4294967295,
"bytes-sent": 0,
"packets-sent": 0,
"input-aborts": 0,
"seconds-since-last-clear-counters": 0,
"parity-packets-received": 0,
"output-underruns": 0,
"availability-flag": 0,
"throttled-packets-received": 0,
"crc-errors": 0,
"framing-errors-received": 0,
"broadcast-packets-received": 0,
"output-buffers-swapped-out": 0,
"applique": 0,
"broadcast-packets-sent": 0,
"input-drops": 0,
"output-drops": 0,
"resets": 0,
"input-ignored-packets": 0,
"output-buffer-failures": 0,
"output-queue-drops": 0,
"giant-packets-received": 0,
"input-errors": 0,
"last-data-time": 1563709820,
"multicast-packets-received": 0,
"output-errors": 0,
"input-overruns": 0,
"carrier-transitions": 0,
"unknown-protocol-packets-received": 0,
"bytes-received": 0
},
"Timestamp": 1563709831182,
"Keys": {
"interface-name": "Null0"
}
},
{
"Content": {
"last-discontinuity-time": 1563431445,
"packets-received": 0,
"runt-packets-received": 0,
"seconds-since-packet-sent": 4294967295,
"input-queue-drops": 0,
"multicast-packets-sent": 0,
"seconds-since-packet-received": 4294967295,
"bytes-sent": 0,
"packets-sent": 0,
"input-aborts": 0,
"seconds-since-last-clear-counters": 0,
"parity-packets-received": 0,
"output-underruns": 0,
"availability-flag": 0,
"throttled-packets-received": 0,
"crc-errors": 0,
"framing-errors-received": 0,
"broadcast-packets-received": 0,
"output-buffers-swapped-out": 0,
"applique": 0,
"broadcast-packets-sent": 0,
"input-drops": 0,
"output-drops": 0,
"resets": 0,
"input-ignored-packets": 0,
"output-buffer-failures": 0,
"output-queue-drops": 0,
"giant-packets-received": 0,
"input-errors": 0,
"last-data-time": 1563709791,
"multicast-packets-received": 0,
"output-errors": 0,
"input-overruns": 0,
"carrier-transitions": 0,
"unknown-protocol-packets-received": 0,
"bytes-received": 0
},
"Timestamp": 1563709831182,
"Keys": {
"interface-name": "HundredGigE0/0/1/2"
}
},
{
"Content": {
"last-discontinuity-time": 1563431445,
"packets-received": 0,
"runt-packets-received": 0,
"seconds-since-packet-sent": 4294967295,
"input-queue-drops": 0,
"multicast-packets-sent": 0,
"seconds-since-packet-received": 4294967295,
"bytes-sent": 0,
"packets-sent": 0,
"input-aborts": 0,
"seconds-since-last-clear-counters": 0,
"parity-packets-received": 0,
"output-underruns": 0,
"availability-flag": 0,
"throttled-packets-received": 0,
"crc-errors": 0,
"framing-errors-received": 0,
"broadcast-packets-received": 0,
"output-buffers-swapped-out": 0,
"applique": 0,
"broadcast-packets-sent": 0,
"input-drops": 0,
"output-drops": 0,
"resets": 0,
"input-ignored-packets": 0,
"output-buffer-failures": 0,
"output-queue-drops": 0,
"giant-packets-received": 0,
"input-errors": 0,
"last-data-time": 1563709791,
"multicast-packets-received": 0,
"output-errors": 0,
"input-overruns": 0,
"carrier-transitions": 0,
"unknown-protocol-packets-received": 0,
"bytes-received": 0
},
"Timestamp": 1563709831182,
"Keys": {
"interface-name": "HundredGigE0/0/1/1"
}
}
],
"Source": "172.16.200.60:23549"
},
"fields": {
"@timestamp": [
"2019-07-21T11:51:48.864Z"
]
},
"sort": [
1563709908864
]
}

Every time, Rows.Timestamp and Telemetry.msg_timestamp end up as plain numbers instead of dates.
Could someone guide me on how to match these timestamps?

Thank you in advance!
Tosamon L.

Rows is an array, so you could use

date { match => [ "[Rows][0][Timestamp]", "UNIX_MS", "UNIX"] }

Thank you Badger, I have tried [Rows][0][Timestamp] but it did not work.
Kibana shows the message "Objects in arrays are not well supported".

After trying:
date {
  match => [ "[Rows][0][Timestamp]", "UNIX_MS" ]
  target => [ "[Rows][0][Timestamp]" ]
}
I got this error message from Logstash:
, {:_id=>nil, :_index=>"logstash", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x21098a8d>], :response=>{"index"=>{"_index"=>"logstash-2019.07.23-000001", "_type"=>"_doc", "_id"=>"yTQwH2wBNWT9XzBB5_6g", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Rows.Timestamp] of different type, current_type [long], merged_type [date]"}}}}

Need advice

OK, so it successfully parsed the field. The problem is you already have documents in your index where that field is a number like 1563709831182, and it cannot be a number on some documents and a date on others.

One option is to wait until tonight, when the index rolls over to logstash-2019.07.24-000001. At that point the message should disappear. Another option is to change the target option on the date filter.
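
For example, a minimal sketch of the second option, parsing the value into a separate field so it does not collide with the existing long mapping (the target name Rows_time is just an illustration, pick whatever suits you):

date {
  match  => [ "[Rows][0][Timestamp]", "UNIX_MS" ]
  # write the parsed date to a new field instead of overwriting the numeric
  # Rows.Timestamp, so the existing long mapping is left alone
  target => "[Rows_time]"
}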

In the Rows field there is an array of nested objects, as shown below.
There is a timestamp for each interface. I tried targeting the date filter at the [Rows][0][Timestamp] field, but the other elements such as [Rows][1][Timestamp] ... [Rows][n][Timestamp] are still numbers. That is probably the cause of the different-type error from Logstash.

"Rows": [
  {
    "Content": {
      "output-drops": 0,
      "resets": 0,
      "input-queue-drops": 0
    },
    "Timestamp": 1563892066066,
    "Keys": {
      "interface-name": "Null0"
    }
  },
  {
    "Content": {
      "output-drops": 0,
      "resets": 0,
      "input-queue-drops": 0
    },
    "Timestamp": 1563892066066,
    "Keys": {
      "interface-name": "MgmtEth0/RP0/CPU0/0"
    }
  },
  {
    "Content": {
      "output-drops": 0,
      "last-discontinuity-time": 1563431445,
      "packets-sent": 6072,
      "resets": 0,
      "input-queue-drops": 0
    },
    "Timestamp": 1563892066066,
    "Keys": {
      "interface-name": "GigabitEthernet0/0/0/0"
    }
  },
  {
    "Content": {
      "output-drops": 0,
      "last-discontinuity-time": 1563431445,
      "packets-sent": 20192179,
      "resets": 0,
      "input-queue-drops": 0
    },
    "Timestamp": 1563892066066,
    "Keys": {
      "interface-name": "GigabitEthernet0/0/0/23"
    }
  },
  {
    "Content": {
      "output-drops": 0,
      "last-discontinuity-time": 1563431445,
      "packets-sent": 0,
      "availability-flag": 0,
      "resets": 0,
      "input-queue-drops": 0
    },
    "Timestamp": 1563892066066,
    "Keys": {
      "interface-name": "GigabitEthernet0/0/0/15"
    }
  }
]

==
I tried the following code, but it added Keys.Timestamp only to the first element of the Rows array.

date {
  match => [ "[Rows][0][Timestamp]", "UNIX_MS" ]
  target => [ "[@Interfacetime]" ]
  add_field => { "[Rows][0][Keys][Timestamp]" => "%{[@Interfacetime]}" }
}
How can we match the Timestamp in every element of the nested array?

You will need to use a ruby filter to convert the timestamp in every entry of the array.

    ruby {
        code => '
            # fetch the whole Rows array from the event
            r = event.get("[Rows]")
            r.each_index { |x|
                # convert the epoch-milliseconds Timestamp of each entry and
                # store the result as a date under Keys
                r[x]["Keys"]["Timestamp"] = LogStash::Timestamp.at(r[x]["Timestamp"].to_f/1000)
            }
            # write the modified array back onto the event
            event.set("Rows", r)
        '
    }

I was surprised to see that 1563709831182 gets converted to 2019-07-21T11:50:31.181Z, which appears to me to be off by a millisecond.
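
That is most likely floating-point rounding: 1563709831.182 cannot be represented exactly as a Float, so the conversion truncates to .181. If that matters, here is a sketch that avoids the float division, assuming LogStash::Timestamp.at accepts the same optional microseconds argument as Ruby's Time.at:

    ruby {
        code => '
            r = event.get("[Rows]")
            r.each_index { |x|
                ms = r[x]["Timestamp"]
                # split epoch milliseconds into whole seconds plus microseconds
                # instead of dividing by a float, so no precision is lost
                r[x]["Keys"]["Timestamp"] = LogStash::Timestamp.at(ms / 1000, (ms % 1000) * 1000)
            }
            event.set("Rows", r)
        '
    }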

Thank you so much, @Badger! This ruby code fixed my problem.
