Upload CSV File to Kibana Dashboard

Hi Team,

I need help with the two points below while uploading a CSV file through the Kibana dashboard.

  1. How can I upload a CSV file larger than 100 MB through the Kibana dashboard?
  2. How can I upload multiple CSV files to the same index?

Thanks,
Debasis

That's not possible through the Kibana UI, but you can use various methods to get the data indexed into Elasticsearch.

  • Beats: you can use Filebeat to monitor your folder for files and automatically index them. Note that Filebeat ships each line as a raw message field, so it is usually paired with an ingest pipeline that splits the CSV columns. Here is a sample Filebeat config:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /path/to/your/csv/files/*.csv

output.elasticsearch:
  hosts: ["http://your-elasticsearch-host:9200"]
  index: "your_index_name"

# Filebeat requires matching template settings when the index name is overridden
setup.template.name: "your_index_name"
setup.template.pattern: "your_index_name*"

  • Bulk API: you can use the Elasticsearch bulk API to index the documents, with a simple Python script to do the processing for you:

import csv

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")  # Replace with your Elasticsearch URL

# Read the CSV file and build one document per row;
# csv.DictReader uses the header row as the field names
with open("/path/to/your/file.csv", newline="") as f:
    actions = ({"_index": "your_index_name", "_source": row} for row in csv.DictReader(f))
    # Stream the documents to Elasticsearch in bulk
    success, _ = bulk(es, actions)

print(f"Indexed {success} documents")

Thanks @ppisljar for your response to my first question. Could you please help me with the second requirement? For example, I upload text.csv to the index "sample" and then want to upload one more CSV file to the same index "sample". Is there any way to achieve this?

Thanks,
Debasis
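For reference, appending more files to an existing index needs no special handling on the Elasticsearch side: each bulk request simply adds more documents to it. A minimal sketch building on the script above, assuming the files sit in one folder (the paths are placeholders, and "sample" matches the index from the question):

import csv
import glob

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

# Each file matched by the glob is appended to the same "sample" index
for path in glob.glob("/path/to/your/csv/files/*.csv"):
    with open(path, newline="") as f:
        reader = csv.DictReader(f)  # header row becomes the field names
        bulk(es, ({"_index": "sample", "_source": row} for row in reader))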

Hi @ppisljar,

Do you have any reference blog showing how Filebeat can be configured to upload CSV files to a particular index?

Thanks,
Debasis

Here is something I found on the web: Load CSV data to ElasticSearch using FileBeat
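
For context, the approach in that post ships each raw CSV line with Filebeat and lets an ingest pipeline split it into fields using the csv processor. A minimal sketch of such a pipeline created via the Python client (the pipeline id "parse_csv" and the field names are illustrative):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# A pipeline that splits the raw "message" line into named CSV columns
es.ingest.put_pipeline(
    id="parse_csv",  # illustrative id
    body={
        "description": "Parse CSV lines shipped by Filebeat",
        "processors": [
            {"csv": {"field": "message", "target_fields": ["field1", "field2"]}}
        ],
    },
)

Filebeat can then be pointed at it with pipeline: "parse_csv" under output.elasticsearch.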

Thanks @ppisljar. I have created the ingest pipelines, but where do I check whether they succeeded or failed? I cannot see the data in the corresponding index. Is there any way to check this?

Thanks,
Debasis

Logs should be in /var/log/filebeat/filebeat. If Filebeat runs under systemd, you can also check journalctl -u filebeat.

Below is my unit file /usr/lib/systemd/system/filebeat.service, and when I check under /var/log there is no filebeat folder as mentioned above. Is there any way to check whether the ingest pipeline is working or in a failed state?

UMask=0027
Environment="GODEBUG='madvdontneed=1'"
Environment="BEAT_LOG_OPTS="
Environment="BEAT_CONFIG_OPTS=-c /etc/filebeat/filebeat.yml"
Environment="BEAT_PATH_OPTS=--path.home /usr/share/filebeat --path.config /etc/filebeat --path.data /var/lib/filebeat --path.logs /var/log/filebeat"
ExecStart=/usr/share/filebeat/bin/filebeat --environment systemd $BEAT_LOG_OPTS $BEAT_CONFIG_OPTS $BEAT_PATH_OPTS
Restart=always

Thanks,
Debasis

Can you confirm Filebeat is running?

Try to follow this tutorial to get it running: Filebeat quick start: installation and configuration | Filebeat Reference [8.10] | Elastic

Hi,
Yes, my Filebeat is up and running.

[root@cb-1 ~]# systemctl status filebeat
● filebeat.service - Filebeat sends log files to Logstash or directly to Elasticsearch.
   Loaded: loaded (/usr/lib/systemd/system/filebeat.service; disabled; vendor preset: disabled)
   Active: active (running) since Mon 2023-09-18 16:26:43 IST; 2 days ago
     Docs: https://www.elastic.co/beats/filebeat
 Main PID: 14193 (filebeat)

Thanks,
Debasis

Hi @ppisljar ,

I followed the same doc to install Filebeat, and it is up and running per the command below. Is there any way to validate whether the ingest pipeline is working properly?

systemctl status filebeat

Thanks,
Debasis

Hi @ppisljar,

Could you please help here?

Thanks,
Debasis

You could use GET /_nodes/stats?metric=ingest&filter_path=nodes.*.ingest.pipelines to get statistics about your ingest pipelines and see the failed count. You can run it from Dev Tools in Kibana.
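
The same counters are also reachable programmatically; a small sketch via the Python client, assuming the client setup from earlier in the thread:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Per-pipeline ingest counters (processed and failed) on every node
stats = es.nodes.stats(metric="ingest")
for node_id, node in stats["nodes"].items():
    for name, pipeline in node["ingest"]["pipelines"].items():
        print(node_id, name, "count:", pipeline["count"], "failed:", pipeline["failed"])

A non-zero failed count for your pipeline points at documents that never made it into the index.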

> Is there any way to validate whether the ingest pipeline is working properly or not?

You can use the simulate pipeline API to test an ingest pipeline.
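
A minimal sketch of such a simulate call via the Python client (the pipeline id and the sample line are illustrative):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Run one sample CSV line through the pipeline without indexing anything
result = es.ingest.simulate(
    id="parse_csv",  # the pipeline to test (illustrative id)
    body={"docs": [{"_source": {"message": "value1,value2"}}]},
)
print(result["docs"][0])  # parsed fields on success, an "error" key on failure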