Filebeat 6.2.3 in Kubernetes: Read line error: decoding docker JSON

Hi,

I have deployed Filebeat 6.2.3 in Kubernetes. Everything works fine for a few hours, but then something goes wrong.
The logs of my ingress-nginx-controller are no longer indexed, and Filebeat reports this error:
2018-05-23T06:15:30.508Z ERROR log/harvester.go:243 Read line error: decoding docker JSON: json: cannot unmarshal number into Go value of type reader.dockerLog; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)

The last indexed line was this one:
{"log":"95.213.177.125 - [95.213.177.125] - - [22/May/2018:16:10:23 +0000] \"CONNECT check.proxyradar.com:80 HTTP/1.1\" 400 174 \"-\" \"-\" 0 0.256 [] - - - -\n","stream":"stdout","time":"2018-05-22T16:10:23.210343159Z"}

The next line in the log is this one:
{"log":"00.000.000.00 - [00.000.000.00] - - [22/May/2018:16:47:08 +0000] \"POST /elasticsearch/_msearch HTTP/1.1\" 403 197 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/64.0.3282.167 Chrome/64.0.3282.167 Safari/537.36\" 543 0.000 [monitoring-kibana-5601] - - - -\n","stream":"stdout","time":"2018-05-22T16:47:08.279680383Z"}

The Filebeat configuration for this container is as follows:

autodiscover:
  providers:
    - condition:
        contains:
          docker.container.image: "nginx-ingress-controller"
      config:
        - module: nginx
          access:
            prospector:
              type: docker
              containers.stream: stdout
              containers.ids:
                - "${data.docker.container.id}"
              processors:
                - add_kubernetes_metadata:
                    in_cluster: true
          error:
            prospector:
              type: docker
              containers.stream: stderr
              containers.ids:
                - "${data.docker.container.id}"
              processors:
                - add_kubernetes_metadata:
                    in_cluster: true

Do you have any idea why my logs are no longer parsed correctly?

Thank you very much.

I just deleted the pods, and since a new one was started, the logs are being ingested correctly...

Great! Let us know if the problem shows up again.

The error shows up again!

2018-05-23T10:39:55.491Z        INFO    log/harvester.go:216    Harvester started for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-23T10:39:55.491Z        ERROR   log/harvester.go:243    Read line error: decoding docker JSON: json: cannot unmarshal number into Go value of type reader.dockerLog; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)
2018-05-23T10:39:55.491Z        ERROR   log/harvester.go:243    Read line error: decoding docker JSON: invalid character 'y' looking for beginning of value; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)

The last indexed line is:
{"log":"5.8.18.92 - [5.8.18.92] - - [23/May/2018:10:34:44 +0000] \"\\x03\\x00\\x00+\u0026\\xE0\\x00\\x00\\x00\\x00\\x00Cookie: mstshash=hello\" 400 174 \"-\" \"-\" 0 0.094 [] - - - -\n","stream":"stdout","time":"2018-05-23T10:34:44.014458055Z"}

The next line is:
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:01 +0000] \"GET /subpath/version-mobile HTTP/2.0\" 401 51 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 1385 0.006 [default-myservice-8080] 10.32.0.31:8080 51 0.008 401\n","stream":"stdout","time":"2018-05-23T10:46:01.463045569Z"}

Could you please share the contents of this file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log? I'm only interested in the lines around this one: 5.8.18.92 - [5.8.18.92] - - [23/May/2018:10:34:44 +0000]...

Here are the lines around the last indexed one:

{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:25 +0000] \"POST /api/metrics/vis/data HTTP/1.1\" 200 486 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 1828 0.020 [monitoring-kibana-5601] 10.32.0.17:5601 486 0.020 200\n","stream":"stdout","time":"2018-05-23T10:31:25.790025711Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:25 +0000] \"GET /api/metrics/fields?index=filebeat-* HTTP/1.1\" 200 37 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 458 0.004 [monitoring-kibana-5601] 10.32.0.17:5601 37 0.004 200\n","stream":"stdout","time":"2018-05-23T10:31:25.981819376Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:27 +0000] \"GET /ui/favicons/favicon-32x32.png HTTP/1.1\" 304 0 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 547 0.002 [monitoring-kibana-5601] 10.32.0.17:5601 0 0.004 304\n","stream":"stdout","time":"2018-05-23T10:31:27.847171961Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:28 +0000] \"GET /ui/favicons/favicon-16x16.png HTTP/1.1\" 304 0 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 547 0.002 [monitoring-kibana-5601] 10.32.0.17:5601 0 0.004 304\n","stream":"stdout","time":"2018-05-23T10:31:28.024105236Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:28 +0000] \"POST /api/metrics/vis/data HTTP/1.1\" 200 751 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 1828 0.100 [monitoring-kibana-5601] 10.32.0.17:5601 751 0.100 200\n","stream":"stdout","time":"2018-05-23T10:31:28.037632638Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:31:28 +0000] \"GET /api/metrics/fields?index=filebeat-* HTTP/1.1\" 200 37 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 458 0.005 [monitoring-kibana-5601] 10.32.0.17:5601 37 0.008 200\n","stream":"stdout","time":"2018-05-23T10:31:28.218912783Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:10:32:45 +0000] \"POST /elasticsearch/_msearch HTTP/1.1\" 200 22431 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 1356 0.634 [monitoring-kibana-5601] 10.32.0.17:5601 22403 0.632 200\n","stream":"stdout","time":"2018-05-23T10:32:45.885468176Z"}
{"log":"5.8.18.92 - [5.8.18.92] - - [23/May/2018:10:34:44 +0000] \"\\x03\\x00\\x00+\u0026\\xE0\\x00\\x00\\x00\\x00\\x00Cookie: mstshash=hello\" 400 174 \"-\" \"-\" 0 0.094 [] - - - -\n","stream":"stdout","time":"2018-05-23T10:34:44.014458055Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:01 +0000] \"GET /app-ws/version-mobile HTTP/2.0\" 401 51 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 1385 0.006 [default-app-service-8080] 10.32.0.31:8080 51 0.008 401\n","stream":"stdout","time":"2018-05-23T10:46:01.463045569Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:02 +0000] \"POST /app-ws/refresh-token HTTP/2.0\" 200 1401 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 1198 0.532 [default-app-service-8080] 10.32.0.31:8080 2481 0.044 200\n","stream":"stdout","time":"2018-05-23T10:46:02.164214622Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:02 +0000] \"GET /app-ws/version-mobile HTTP/2.0\" 200 5 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 1116 0.005 [default-app-service-8080] 10.32.0.31:8080 5 0.004 200\n","stream":"stdout","time":"2018-05-23T10:46:02.986110301Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:03 +0000] \"GET /app-ws/version-mobile HTTP/2.0\" 200 5 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 39 0.004 [default-app-service-8080] 10.32.0.31:8080 5 0.004 200\n","stream":"stdout","time":"2018-05-23T10:46:03.483461454Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:03 +0000] \"GET /app-sg/data/ HTTP/2.0\" 200 183 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 124 0.004 [default-sync-gateway-service-4984] 10.32.1.28:4984 183 0.004 200\n","stream":"stdout","time":"2018-05-23T10:46:03.790551672Z"}
{"log":"00.000.00.000 - [00.000.00.000] - - [23/May/2018:10:46:04 +0000] \"GET /app-sg/data/ HTTP/2.0\" 200 183 \"-\" \"Mozilla/5.0 (Linux; Android 6.0; MYA-L11 Build/HUAWEIMYA-L11; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.158 Mobile Safari/537.36\" 69 0.003 [default-sync-gateway-service-4984] 10.32.1.27:4984 183 0.004 200\n","stream":"stdout","time":"2018-05-23T10:46:04.092312269Z"}

I received another error:

2018-05-23T12:51:57.594Z        ERROR   log/harvester.go:243    Read line error: decoding docker JSON: invalid character 'T' looking for beginning of value; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)

For these lines (the last correctly indexed one is the 4th):

{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:31:06 +0000] \"GET /ui/favicons/favicon-32x32.png HTTP/1.1\" 304 0 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 547 0.006 [monitoring-kibana-5601] 10.32.0.17:5601 0 0.008 304\n","stream":"stdout","time":"2018-05-23T12:31:06.684417017Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:31:07 +0000] \"GET /ui/favicons/favicon-16x16.png HTTP/1.1\" 304 0 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 547 0.003 [monitoring-kibana-5601] 10.32.0.17:5601 0 0.004 304\n","stream":"stdout","time":"2018-05-23T12:31:07.072998177Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:31:07 +0000] \"POST /elasticsearch/_msearch HTTP/1.1\" 200 69534 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 1255 0.778 [monitoring-kibana-5601] 10.32.0.17:5601 69443 0.780 200\n","stream":"stdout","time":"2018-05-23T12:31:07.883790922Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:31:15 +0000] \"POST /elasticsearch/_msearch HTTP/1.1\" 200 23860 \"http://kibana.mydomain.com/app/kibana\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/66.0.3359.139 Chrome/66.0.3359.139 Safari/537.36\" 1588 0.872 [monitoring-kibana-5601] 10.32.0.17:5601 23832 0.868 200\n","stream":"stdout","time":"2018-05-23T12:31:15.280776281Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:46:49 +0000] \"GET /grafana HTTP/1.1\" 404 3558 \"http://devtools.mydomain.com/confluence/display/PROJECT/My+Page\" \"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36\" 650 0.017 [monitoring-grafana-3000] 10.32.2.15:3000 14545 0.016 404\n","stream":"stdout","time":"2018-05-23T12:46:49.708948048Z"}
{"log":"00.000.000.000 - [00.000.000.000] - - [23/May/2018:12:46:49 +0000] \"GET /public/build/grafana.dark.css?v5.0.4 HTTP/1.1\" 304 0 \"http://grafana.mydomain.com/grafana\" \"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Safari/537.36\" 596 0.002 [monitoring-grafana-3000] 10.32.2.15:3000 0 0.000 304\n","stream":"stdout","time":"2018-05-23T12:46:49.997672894Z"}

Each time, recreating the pods makes the file get reindexed correctly.
The only recurring pattern I see is that each time an error occurs, the response code is not 200.
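
One way to rule out corruption in the file itself is to decode every line of the json.log outside Filebeat, with a throwaway checker along these lines (just a quick sketch, not Filebeat code; the struct simply mirrors the docker json-file fields):

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// Decode every line of a docker json-file log and report any line that does
// not parse. If nothing is reported, the file itself is fine and the problem
// is more likely on the reading side (e.g. the offset the harvester resumes from).
func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: checker <path-to-json.log>")
		os.Exit(1)
	}

	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // access-log lines can be long

	lineNo := 0
	for scanner.Scan() {
		lineNo++
		var entry struct {
			Log    string `json:"log"`
			Stream string `json:"stream"`
			Time   string `json:"time"`
		}
		if err := json.Unmarshal(scanner.Bytes(), &entry); err != nil {
			fmt.Printf("line %d does not decode: %v\n", lineNo, err)
		}
	}
	if err := scanner.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}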

This looks pretty similar to this open issue: https://github.com/elastic/beats/issues/7045

So far I've had no luck trying to reproduce it :frowning: I will keep trying.

Best regards

If it helps, here is the harvester debug log from when the error occurred. Is there anything else that could help the investigation?

2018-05-24T08:53:48.283Z	INFO	log/harvester.go:216	Harvester started for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	ERROR	log/harvester.go:243	Read line error: decoding docker JSON: invalid character 'o' looking for beginning of value; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:468	Stopping harvester for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:478	Closing file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:348	Update state: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log, offset: 451348
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:489	harvester cleanup finished for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	INFO	log/harvester.go:216	Harvester started for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	ERROR	log/harvester.go:243	Read line error: decoding docker JSON: json: cannot unmarshal number into Go value of type reader.dockerLog; File: %!(EXTRA string=/var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log)
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:468	Stopping harvester for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:478	Closing file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:348	Update state: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log, offset: 6618202
2018-05-24T08:53:48.283Z	DEBUG	[harvester]	log/harvester.go:489	harvester cleanup finished for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
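
If I read the "Update state" lines correctly, those offsets (451348 and 6618202) are where the harvesters will resume. A small peek tool along these lines (again only a sketch; the file path and offset are passed as arguments) can show whether a given offset actually lands on the '{' that starts a line:

package main

import (
	"fmt"
	"os"
	"strconv"
)

// Print the bytes sitting at a given offset of a docker json-file log, to see
// whether a resume offset points at the start of a line ('{') or into the
// middle of one.
func main() {
	if len(os.Args) < 3 {
		fmt.Fprintln(os.Stderr, "usage: peek <path-to-json.log> <offset>")
		os.Exit(1)
	}

	offset, err := strconv.ParseInt(os.Args[2], 10, 64)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	buf := make([]byte, 120)
	n, err := f.ReadAt(buf, offset)
	if err != nil && n == 0 { // io.EOF with n > 0 still leaves data to print
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("%q\n", buf[:n])
}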

After a new pod is created, it indexes the logs correctly from the start. Here is the debug log:

2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:447	Setting offset for file based on seek: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:433	Setting offset for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log. Offset: 0 
2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:348	Update state: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log, offset: 0
2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:447	Setting offset for file based on seek: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:433	Setting offset for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log. Offset: 0 
2018-05-24T08:58:02.232Z	DEBUG	[harvester]	log/harvester.go:348	Update state: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log, offset: 0
2018-05-24T08:58:02.232Z	INFO	log/harvester.go:216	Harvester started for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:58:02.232Z	INFO	log/harvester.go:216	Harvester started for file: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log
2018-05-24T08:58:20.355Z	DEBUG	[harvester]	log/log.go:85	End of file reached: /var/lib/docker/containers/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321/f6fde94852bb9b1dcfea85a0f4b2d813c7e0b062fcefce22b4261257d97f3321-json.log; Backoff now.

For your information, I've just upgraded to Filebeat 6.2.4, and the same error occurs.

I'm getting the same error with 6.2.4.

... and it went away on restarting the Filebeat pods (with a configuration change, but only to do with the output).

There is a tentative fix in progress: https://github.com/elastic/beats/pull/7281
