I am sending CSV data from Filebeat to Elasticsearch, following this guide:
I have created an ingest pipeline, tested it with the _simulate API, and everything looks fine.
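For reference, the _simulate test I ran looks roughly like this (the sample document is a line taken from the actual CSV file; the pipeline body itself is omitted here):

POST _ingest/pipeline/binaryload-statistics/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "1612791435943,1612791436942,999,\"h:10.0.20.40|i:1|m:0\",\"mainmessage\",0,858,0,0,0,0,0,0,0,205920,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0"
      }
    }
  ]
}

The simulate output looked correct, which is why I believe the pipeline itself is fine.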
This is my Filebeat configuration:
filebeat.yml: |
  filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /test/*.csv
    exclude_lines: ['^StartTimeMs']

  output.elasticsearch:
    enabled: true
    hosts: ["{{ .Values.filebeat.elasticsearch.connection.url }}"]
    path: "{{ .Values.filebeat.elasticsearch.connection.path }}"
    username: {{ .Values.filebeat.elasticsearch.credentials.username }}
    password: {{ .Values.filebeat.elasticsearch.credentials.password }}
    indices:
      -index: "binaryload"
    pipeline: "binaryload-statistics"

  logging.metrics.enabled: false
  logging.level: debug
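I haven't included the full ingest pipeline above, but as a rough sketch of the kind of pipeline the configuration refers to, assuming a csv processor that splits the message field (the target field names below are placeholders, not my actual ones):

PUT _ingest/pipeline/binaryload-statistics
{
  "description": "Illustrative sketch only: parse binaryload CSV lines; target field names are placeholders",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["start_time_ms", "stop_time_ms", "duration_ms"],
        "ignore_missing": true
      }
    }
  ]
}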
In the Filebeat logs I can see the event being published:
2021-02-08T13:37:17.062Z DEBUG [processors] processing/processors.go:187 Publish event: {
  "@timestamp": "2021-02-08T13:37:17.062Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.8.0"
  },
  "ecs": {
    "version": "1.5.0"
  },
  "host": {
    "name": "trafficgen-binaryload-g2kg9"
  },
  "agent": {
    "id": "69cfab42-7e54-4ad3-a841-8d1098d8b0af",
    "name": "trafficgen-binaryload-g2kg9",
    "type": "filebeat",
    "version": "7.8.0",
    "hostname": "trafficgen-binaryload-g2kg9",
    "ephemeral_id": "e4e5f316-bd0f-4273-8487-5595b761c6cb"
  },
  "log": {
    "file": {
      "path": "/test/roundtrip_latency_GTP.csv"
    },
    "offset": 92699
  },
  "message": "1612791435943,1612791436942,999,\"h:10.0.20.40|i:1|m:0\",\"mainmessage\",0,858,0,0,0,0,0,0,0,205920,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0",
  "input": {
    "type": "log"
  }
}
On the Elasticsearch side, I don't see any errors in the logs, but I also don't see any new indices being created or any existing indices being updated. I can't figure out what is going wrong.
I have also tried manually creating an index in Elasticsearch called 'binaryload', but it doesn't get updated either.
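For reference, the checks I'm doing on the Elasticsearch side amount to something like the following (illustrative; the pipeline is there, but no documents ever show up for the new data):

GET _ingest/pipeline/binaryload-statistics
GET _cat/indices?v
GET binaryload/_search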