Upload CSV file into Elasticsearch using Filebeat

I'm trying to load a CSV file into Elasticsearch using Filebeat. Here is my filebeat.yml:

filebeat.inputs:
#- type: log
- type: stdin
  setup.template.overwrite: true
  enabled: true
  close_eof: true
  paths: 
    - /usr/share/filebeat/dockerlogs/*.csv


  processors:
  - decode_csv_fields:
      fields:
        message: "message"
      separator: ","
      ignore_missing: false
      overwrite_keys: true
      trim_leading_space: true
      fail_on_error: true
  - drop_fields:
      fields: [ "log", "host", "ecs", "input", "agent" ]
  - extract_array:
      field: message
      mappings:
        sr: 0
        Identifiant PSI: 1
        libellé PSI: 2
        Identifiant PdR: 3
        T3 Date Prévisionnelle: 4
        DS Reporting PdR: 5
        Status PSI: 6
        Type PdR: 7
  - drop_fields:
      fields: ["message", "sr"]

  #index: rapport_g035_prov_1

filebeat.registry.path: /usr/share/filebeat/data/registry/filebeat/filebeat

output:
  elasticsearch:
    enabled: true
    hosts: ["IPAdress:8081"]
    indices:
      - index: "rapport_g035_prov"
      #- index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"
      #- index: "filebeat-7.7.0"
#setup.dashboards.kibana_index: file-*
seccomp.enabled: false
logging.metrics.enabled: false

But when I check the index in Kibana, I find that the index can't read the column names.

I tried to process the CSV in filebeat.yml another way:

filebeat.inputs:
- type: log
  setup.template.overwrite: true
  enabled: true
  close_eof: true
  paths: 
    - /usr/share/filebeat/dockerlogs/*.csv


  processors:
  - decode_csv_fields:
      fields:
        message: decoded.csv
      separator: ","
      ignore_missing: false
      overwrite_keys: true
      trim_leading_space: false
      fail_on_error: true

  #index: rapport_g035_prov_1

filebeat.registry.path: /usr/share/filebeat/data/registry/filebeat/filebeat

output:
  elasticsearch:
    enabled: true
    hosts: ["IPAdress:8081"]
    indices:
      - index: "rapport_g035_prov"
      #- index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"
      #- index: "filebeat-7.7.0"
#setup.dashboards.kibana_index: file-*
seccomp.enabled: false
logging.metrics.enabled: false

But I got the same error: it can't map the index correctly. I know there is a problem in the processing of the CSV in filebeat.yml, but I don't know what it is.

Have you tried to process the CSV with an ingest pipeline? That's the preferred method to shape your ingested data, since you can centralize the business rules that produce Elasticsearch documents inside the very same database, and the Filebeat component just sends the raw lines to ES.
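
For illustration, here is a minimal sketch of such a pipeline using Elasticsearch's csv processor, reusing the column names from your extract_array mapping. The pipeline name rapport_g035_csv is just a placeholder, and you may need to adjust the separator and trim options to match your file:

PUT _ingest/pipeline/rapport_g035_csv
{
  "description": "Split Filebeat's raw message field into the G035 report columns",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": [
          "sr",
          "Identifiant PSI",
          "libellé PSI",
          "Identifiant PdR",
          "T3 Date Prévisionnelle",
          "DS Reporting PdR",
          "Status PSI",
          "Type PdR"
        ],
        "separator": ",",
        "trim": true
      }
    },
    {
      "remove": {
        "field": ["message", "sr"]
      }
    }
  ]
}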

Take a look at this excellent video explaining how to create the pipeline from the ES Stack Management interface.
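
On the Filebeat side you would then drop the decode_csv_fields/extract_array processors entirely and just point the output at the pipeline, something like this (again assuming the placeholder pipeline name above):

output.elasticsearch:
  hosts: ["IPAdress:8081"]
  index: "rapport_g035_prov"
  pipeline: "rapport_g035_csv"
  # note: when you override index, Filebeat also expects
  # setup.template.name and setup.template.pattern to be set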

Hope it helps!
