Unable to read the time field from Docker logs


(Shekhar Agrawal) #1

I have configured Filebeat on my Kubernetes cluster, together with the Elastic Stack, to collect all the generated Docker logs.

The log lines are in this format:

{"log":"Execution time (ms): 4\n","stream":"stdout","time":"2018-01-26T05:53:22.342585873Z"}

It appears that Filebeat is able to parse these lines for `message` and `stream`, but I believe the timestamp is somehow getting skipped. As a result, when I try to view the logs in Elasticsearch/Kibana, I see them in random order.

Any pointers??


(Noémi Ványi) #2

Could you share your whole config and possibly the debug logs of your Beat (the output of `{beatname} -e -d "*"`)?


(Shekhar Agrawal) #3

I am deploying Filebeat as a Kubernetes Pod using this YAML: `curl -L -O https://raw.githubusercontent.com/elastic/beats/master/deploy/kubernetes/filebeat-kubernetes.yaml`

I am not really sure how I can capture these debug logs; the log I see on the Pod doesn't seem to contain any information about the metadata Filebeat is exporting.


(Carlos Pérez Aradros) #4

Hi @shekharupland,

The parsed timestamp should end up in the `@timestamp` field; is that what you are checking? Could you provide some examples of the correct timestamp in the file, and of the wrong one in the final event?

Best regards


(Shekhar Agrawal) #5

Attaching one example for your reference.

These are the log lines I can see in my container log file:
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e Function Data {"executionId":"708FF189AA5A42EBA8A4DAF8B22DCA81","functionName":"echo20180202005722","functionuuid":"035F299142AE4739BF3074BA56E4407B","bindings":{"resources":{}}} \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.49267025Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e Event Data {} \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.49268216Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e endianness LE \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.492685621Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e os type Linux \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.492844604Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e OS platform linux \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.492851647Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e Total memory : 7570403328 bytes. \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.493038568Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e Free memory 2682531840 bytes. \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.493048689Z"}
{"log":"\u003cLogger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 \u003e Change in the code. \u003c/Logger\u003e\n","stream":"stdout","time":"2018-02-02T15:37:11.493051886Z"}


Below is the Elasticsearch response for the same criteria. The problem is that all my log statements come back in random order. I think it is the timestamp precision, more than the timestamp itself, that is causing the issue. Appreciate any help.
{
  "took": 14,
  "timed_out": false,
  "_shards": {
    "total": 10,
    "successful": 10,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 11,
    "max_score": 10.862274,
    "hits": [
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.492Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > Function Data {\"executionId\":\"708FF189AA5A42EBA8A4DAF8B22DCA81\",\"functionName\":\"echo20180202005722\",\"functionuuid\":\"035F299142AE4739BF3074BA56E4407B\",\"bindings\":{\"resources\":{}}} "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.492Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > os type Linux "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.493Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > Total memory : 7570403328 bytes. "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.492Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > endianness LE "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.493Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > Change in the code. "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.492Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > OS platform linux "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.487Z",
          "message": "This is a execution handler for execution Id 708FF189AA5A42EBA8A4DAF8B22DCA81 oracle-functions-node-6f4895fdf9-7hvjc"
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.492Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > Event Data {} "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.493Z",
          "message": "<Logger: 1517585831492 708FF189AA5A42EBA8A4DAF8B22DCA81 > Free memory 2682531840 bytes. "
        }
      },
      {
        "_source": {
          "@timestamp": "2018-02-02T15:37:11.493Z",
          "message": "<Logger: 1517585831493 708FF189AA5A42EBA8A4DAF8B22DCA81 > Execution time (ms): 119999998 "
        }
      }
    ]
  }
}


(Carlos Pérez Aradros) #6

The documents look good. I'm wondering, did you use sort in your query? https://www.elastic.co/guide/en/elasticsearch/reference/6.1/search-request-sort.html


(Shekhar Agrawal) #7

Yes, I have used sort on `@timestamp`. But I think the problem is that the log entries differ only at nanosecond precision, and Elasticsearch is not able to distinguish them.
I was thinking that if I could somehow import the `time` field itself into Elasticsearch, then for a given window I could sort the results myself using that additional attribute. But it seems that when Filebeat writes to Elasticsearch, it only carries `log` and `stream` and skips the `time` from the Docker logs.
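One way to attempt this (a minimal sketch, not a tested configuration; the paths and the idea of decoding the Docker JSON log lines yourself are assumptions) would be a log prospector with Filebeat's JSON decoding options, so that the raw `time` string survives as its own event field alongside `@timestamp`:

```yaml
# Sketch only: decode the Docker JSON log lines in Filebeat itself so the
# raw "time" string is kept as a separate field for tie-breaking sorts.
filebeat.prospectors:
- type: log
  paths:
    - /var/lib/docker/containers/*/*.log   # assumed Docker log location
  json.keys_under_root: true   # lift "log", "stream", "time" to the top level
  json.add_error_key: true     # mark lines that fail to parse as JSON
```

Whether the kept `time` string actually helps then depends on how it is mapped in Elasticsearch (as a `keyword` it would at least sort lexicographically, which works for this fixed-width timestamp format).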


(Carlos Pérez Aradros) #8

Understood. I think you are hitting this issue: https://github.com/elastic/elasticsearch/issues/10005; there has been some recent activity on it.
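That issue is about `date` fields storing only millisecond precision. A quick illustration of why that collapses these events (plain Python standing in for the truncation, not Elasticsearch itself; the sample timestamps are taken from the log lines above):

```python
# Docker timestamps from the logs above: they differ only below the millisecond.
times = [
    "2018-02-02T15:37:11.492685621Z",  # endianness LE
    "2018-02-02T15:37:11.492844604Z",  # os type Linux
    "2018-02-02T15:37:11.492851647Z",  # OS platform linux
]

def to_millis(ts: str) -> str:
    # Keep only the first three fractional digits, as a millisecond-resolution
    # date field effectively does.
    whole, frac = ts.rstrip("Z").split(".")
    return f"{whole}.{frac[:3]}Z"

truncated = [to_millis(t) for t in times]
print(truncated)
# All three events land on the same millisecond, so a sort on @timestamp
# alone has no way to order them deterministically.
print(len(set(truncated)))  # 1
```

This is why the hits come back in arbitrary order: the sort key is tied, and Elasticsearch makes no ordering guarantee among tied documents.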


(system) #9

This topic was automatically closed after 21 days. New replies are no longer allowed.