Pushing Excel data with row and column headers into Elasticsearch

Hello,
I need suggestion on how to proceed with this problem.
The following is my csv file..
|value_type |1-Sep|3-Sep|4-Sep|5-Sep|6-Sep|7-Sep|8-Sep|10-Sep|11-Sep|12-Sep|13-Sep|14-Sep|15-Sep|17-Sep|18-Sep|19-Sep|20-Sep|21-Sep|22-Sep|24-Sep|25-Sep|26-Sep|27-Sep|28-Sep|29-Sep|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
|example1|0.00|10.45|18.65|16.85|16.65|17.95|3.25|20.25|16.90|16.45|15.40|14.05|2.85|23.50|16.95|15.25|17.70|15.00|0.00|21.90|15.60|15.35|13.95|13.50|4.60|
|example2|0.00|51.32|66.19|57.69|57.01|60.95|0.00|51.08|51.31|43.95|48.41|52.39|0.00|57.40|57.51|43.84|47.09|47.58|3.60|56.85|42.87|38.80|49.89|43.52|0.00|
There are row and column headers in my data: the column headers are dates and the row header ('value_type') is a string. I want to push this data into Elasticsearch, and later visualize it in Kibana based on the date (current, weekly, monthly).
For example, the current visualization should display the 'value_type' field on the x-axis and ROI on the y-axis, where ROI is the value for the current date.
Weekly visualization: the 'value_type' field on the x-axis and ROI on the y-axis, where ROI is the average value over the last week.
Monthly visualization: the 'value_type' field on the x-axis and ROI on the y-axis, where ROI is the average value over the entire month.
If somebody could guide me or provide some relevant URLs on how to push the above data into Elasticsearch and visualize it in Kibana, it would be a great help.
PS: I am a beginner.
Thanks in advance

Hi,
As simple as it seems: if you don't process a lot of data, stick to Excel.
If you want Elasticsearch, you need Kibana and also Logstash, which can process the data and send it to your Elasticsearch instance. You need to write a Logstash event processing pipeline (https://www.elastic.co/guide/en/logstash/current/pipeline.html).
Your input is probably the CSV file, so you need the input plugin "file" (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html).
If you don't need to transform your data, you might not need a filter plugin and can go straight to the output plugin "elasticsearch" (https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html).
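For reference, a pipeline for a CSV input could look roughly like the sketch below. Note that it assumes the sheet has first been transposed so each line holds one date plus the values for each value_type; the path, column names, index name and date pattern are placeholders you would have to adapt:

```
input {
  file {
    path => "/path/to/data.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["date", "example1", "example2"]
  }
  date {
    # matches headers like "3-Sep"; the year is guessed since the data has none
    match => ["date", "d-MMM"]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "yourindexname-%{+YYYY.MM.dd}"
  }
}
```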

In general I would recommend using Logstash with JSON files. If you can provide your data as JSON, the processing within Logstash will be much easier.

Example with json:

input {
  file {
    type => "typename"
    path => "foldertoyourfiles/*"
    codec => "json"
    # start_position => "beginning"
  }
}

filter {
  # if necessary, put stuff here too!
}

output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "yourindexname-%{+YYYY.MM.dd}"
    codec => "json"
  }
}
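With that config, each line of your file would need to be a self-contained JSON object, for example (field names assumed here):

```
{ "value_type": "example1", "date": "3-Sep", "roi": 10.45 }
```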

Good Luck!


You can also have a look at

Note that as of Elasticsearch 6.6.0 you now have a CSV importer in Kibana, which is experimental but might be useful. Have a look at https://www.elastic.co/guide/en/kibana/6.6/xpack-ml.html


Hello, thank you for the reply, but the thing is my Logstash is not working: whenever I run the config file, the output gets stuck. I have already raised this problem on the forum.
So I have to do this with Elasticsearch directly. I wanted to know how I can push CSV data with row and column headers to Elasticsearch.

Also, the date headers are in a date format, so I want to use them as the timestamp. Can I do that, and how?
Thanks in advance

Well, I would recommend fixing your Logstash, but you can also try to send data to Elasticsearch directly.
You can read here how to index data using curl or via the Kibana Dev Tools: https://www.elastic.co/guide/en/elasticsearch/guide/master/index-doc.html
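For example, since the date headers can become timestamps, a small script can reshape the sheet into documents for the bulk API. This is only a sketch: the index name, the `_doc` type (mappings require a type in 6.x), and the year (your CSV doesn't contain one) are all assumptions you should adapt.

```python
import csv
import io
import json

# Month abbreviations used in the headers; extend as needed.
MONTHS = {"Sep": "09"}

def csv_to_actions(csv_text, index_name="roi-data", year=2018):
    """Turn the sheet into Elasticsearch bulk actions: one document per
    (value_type, date) cell, with the date header as the timestamp."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    dates = rows[0][1:]  # column headers: "1-Sep", "3-Sep", ...
    actions = []
    for row in rows[1:]:
        value_type = row[0]
        for date_str, value in zip(dates, row[1:]):
            day, mon = date_str.split("-")
            actions.append({"index": {"_index": index_name, "_type": "_doc"}})
            actions.append({
                "value_type": value_type,
                "roi": float(value),
                "@timestamp": "%d-%s-%02d" % (year, MONTHS[mon], int(day)),
            })
    return actions

def bulk_body(actions):
    """Newline-delimited JSON for the _bulk endpoint."""
    return "\n".join(json.dumps(a) for a in actions) + "\n"
```

You can then POST the resulting body to http://localhost:9200/_bulk with the header Content-Type: application/x-ndjson, e.g. with curl or from the Kibana Dev Tools console.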

Good Luck!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.