33000 datapoints per second in a graph flowing from right to left


(Shashi S) #1

We have 10 million records of data stored in a comma-separated value (CSV) file, and files like these are generated every 5 minutes. If we divide 10 million by 5 minutes (300 seconds), that's about 33,000 records or data points per second.

  1. Can we show these 33,000 data points per second, with the graph flowing from right to left for 5 minutes, and then have it repeat for the next file that arrives 5 minutes later (10 million records generated every 5 minutes in each file), using your tool?

  2. Also, what type of data input is required? Can your tool read from comma-separated value (CSV) files? Our data is stored in CSV files.
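The rate quoted above can be checked quickly (numbers taken from the post):

```python
# One file of 10 million records arrives every 5 minutes.
records_per_file = 10_000_000
file_interval_s = 5 * 60  # 300 seconds

# Average ingest/display rate implied by those numbers.
rate = records_per_file / file_interval_s
print(round(rate))  # about 33333 records per second
```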


(Chris Cowan) #2

For the graphing portion of your question, that shouldn't be a problem for Kibana. Kibana alone doesn't read the data; it's an interface that sits on top of Elasticsearch and is used to visualize the data. So you will need to ingest your data (CSV) into Elasticsearch. The easiest way would be to set up Logstash to ingest each file when it becomes available.
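A minimal Logstash pipeline for CSV-to-Elasticsearch ingestion might look like the sketch below. The path, column names, and index name are placeholders, not from the post; adjust them to your data.

```
input {
  file {
    path => "/path/to/csv/*.csv"       # directory where the CSV files land
    start_position => "beginning"      # read existing file content from the top
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "value"]  # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder Elasticsearch host
    index => "csv-data"                # placeholder index name
  }
}
```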


(Shashi S) #3

Hi Chris,

Thanks for your reply. Can Kibana handle 33,000 data points per second flowing from right to left for 5 minutes, and then refresh the chart with another 5 minutes of 33,000 data points per second flowing from right to left?

Thanks again for your timely response.


(Chris Cowan) #4

Yes... visualizing the data shouldn't be a problem.



(Shashi S) #6

Hi Chris,

Thanks for your reply. I have a small question.
Suppose we give the below in the Logstash configuration file:

input {
file {
path => "/Users/shashi/CSVFiles/Core2/*.txt"
}
}

I know the above can be used to read input from multiple .txt files in the Core2 directory.
My requirement is that new files are continuously added to this directory, for example every 5 minutes. What should I do to read the new files that arrive after Logstash has processed all of the previous files?
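For what it's worth, the file input watches the glob and should pick up new files as they appear; something like the following is a sketch (the exact option names and defaults depend on your Logstash version, so verify against its documentation):

```
input {
  file {
    path => "/Users/shashi/CSVFiles/Core2/*.txt"
    # Read each newly discovered file from the top, not only lines
    # appended after discovery (the default is to tail from the end).
    start_position => "beginning"
    # How often, in seconds, the plugin rescans the glob for new files.
    discover_interval => 15
  }
}
```

Logstash also keeps a sincedb file recording how far it has read into each file, so already-processed files are not re-ingested on restart.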


(system) #7