Graph time series data (Beginner)

Hi, I am a total beginner with Elasticsearch, Logstash, and Kibana.

I have data in a format like this:

2015-03-05T16:46:10-05:00,36.23041940373926,45.53721791849107
2015-03-05T16:56:19-05:00,33.38862166196716,43.373693813236024
2015-03-05T17:06:19-05:00,38.093667546174146,48.09500247402276
2015-03-05T17:16:19-05:00,39.59381739755571,45.62129113468801
2015-03-05T17:26:19-05:00,38.82372654155496,45.68936598456894
2015-03-05T17:36:31-05:00,38.44134536505332,45.868852459016395
2015-03-05T17:46:43-05:00,39.329922429443805,44.39491497440977
2015-03-05T17:56:43-05:00,40.87433510638298,41.975103734439834
2015-03-05T18:06:56-05:00,43.599535115391,41.92262991864519
2015-03-05T18:17:15-05:00,43.88961892247043,42.02232435981615
2015-03-05T18:27:28-05:00,38.79637262984336,46.72712283594394
2015-03-05T18:37:35-05:00,38.92363396971692,44.91706355723436
2015-03-05T18:47:58-05:00,38.23903369357915,43.38422391857507
2015-03-05T18:58:36-05:00,40.83538083538084,37.47741829528658
2015-03-05T19:09:13-05:00,46.38612271121105,42.547729825124335
2015-03-05T19:19:43-05:00,42.31678486997636,42.88370520622042
2015-03-05T19:30:09-05:00,38.886134523219305,41.398116013882
2015-03-05T19:40:26-05:00,39.49152542372881,41.931104700492114
2015-03-05T19:50:36-05:00,39.57673721046493,40.516666666666666
2015-03-05T20:00:55-05:00,40.09702241552359,37.85320180571811
2015-03-05T20:11:13-05:00,47.281795511221944,39.65431278045538
2015-03-05T20:21:21-05:00,43.19441960293329,45.09944454398853
2015-03-05T20:31:31-05:00,40.7845068110947,42.580750942777506
2015-03-05T20:41:52-05:00,39.51650395165039,44.745815251084935
2015-03-05T20:52:19-05:00,38.8852883992223,44.381568310428456
2015-03-05T21:02:57-05:00,39.152240737744705,42.33576642335766
2015-03-05T21:13:39-05:00,48.67767270395897,47.006900978976084

It is a CSV.

From Logstash I pushed the data into an Elasticsearch index.

However, when I try to graph this data using Kibana, I am completely lost.

I guess it may be because of the uneven time intervals.

How can I display this in Kibana?

Regards.

Avc

What is the structure of your data in Elasticsearch? Is each item in the CSV a separate field? If so, you should be able to select the visualization you want in Visualize (e.g. a line chart), do a Date Histogram on the X-axis, and either a count of the documents or a function over one of the fields (e.g. average) on the Y-axis.
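If you're not sure how the fields were indexed, the mapping API will show you (replace "yourindex" with your actual index name):

    # Show how Elasticsearch mapped each field
    curl -XGET "http://localhost:9200/yourindex/_mapping?pretty"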

Hi,

Thanks for the reply. My fields look like this:

Time,C1A,C1B
2015-03-05T16:46:10-05:00,36.23041940373926,45.53721791849107

The field I select on the Y-axis is C1A (I don't know yet how to select both C1A and C1B), and on the X-axis I select Time.

It returns nothing.

Regards,

A

Could you share a screenshot of your configuration from Visualize?

Sorry for dropping out of the conversation.

My CSV file is this:

Time,SPA,SPB
Thu Mar 12 16:10:25 EDT 2015,41.30672072,43.15346226
Thu Mar 12 16:20:55 EDT 2015,40.14174087,43.43324251
Thu Mar 12 16:31:03 EDT 2015,41.03425118,41.57152451
Thu Mar 12 16:41:03 EDT 2015,41.60066007,40.12864918
Thu Mar 12 16:51:03 EDT 2015,40.94268353,39.95073892
Thu Mar 12 17:01:03 EDT 2015,42.63400197,36.82393556
Thu Mar 12 17:11:18 EDT 2015,43.05921053,39.68384653
Thu Mar 12 17:21:34 EDT 2015,42.19471642,38.6971766
Thu Mar 12 17:31:34 EDT 2015,38.66909452,42.31278993
Thu Mar 12 17:41:34 EDT 2015,38.25136612,39.92710404
Thu Mar 12 17:51:34 EDT 2015,38.63862206,37.51243781
Thu Mar 12 18:01:34 EDT 2015,40.45046695,37.14704804

and my Logstash configuration is:

input {
  file {
    path => ["/home/user/sputils.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    columns => ["Time", "SPA", "SPB"]
    separator => ","
  }

  date {
    match => ["Time", "%{EEE MMM dd HH:mm:ss zzz YYYY}"]
  }

  mutate { convert => ["SPA", "float"] }
  mutate { convert => ["SPB", "float"] }
}

output {
  # elasticsearch {
  #   action => "index"
  #   host => "localhost"
  #   index => "sutil"
  #   workers => 1
  # }

  stdout { codec => rubydebug }
}

I get:

{
       "message" => [
        [0] "Mon Mar 16 01:12:00 EDT 2015,56.9680939,57.83989414"
    ],
      "@version" => "1",
    "@timestamp" => "2015-07-24T14:10:58.529Z",
          "host" => "0.0.0.0",
          "path" => "/home/user/sputils.csv",
          "Time" => "Mon Mar 16 01:12:00 EDT 2015",
           "SPA" => 56.9680939,
           "SPB" => 57.83989414
}

However "Time" is not the time type. it is getting generated as String.

I went through this, but couldn't do much.

Regards

However "Time" is not the time type. it is getting generated as String.

This is expected. The date filter parses the supplied field and (by default) stores the result in the @timestamp field, which Kibana then uses. The source field (Time in your case) might as well be deleted since it's redundant. I believe the name of the field that Kibana uses for timestamps is configurable if you have a strong preference about the name of the field, but keep in mind that the index template that Logstash configures ES to use is specifically tailored for @timestamp.
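For example, the source field can be dropped in the same filter (a minimal sketch; the pattern placeholder stands for whatever format string ends up working for you, and remove_field is only applied when parsing succeeds):

filter {
  date {
    match => ["Time", "<your date pattern>"]
    # remove_field only runs if the date filter successfully parses Time
    remove_field => ["Time"]
  }
}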

However, note that the date filter failed to parse "Mon Mar 16 01:12:00 EDT 2015" so there's something wrong with your date pattern. Fix that first.
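If I'm reading it right, two things look off: the field name must be a quoted string, and the pattern should be a bare Joda-Time format with no %{ } wrapper. Something like the sketch below (note that YYYY is Joda's "week-year" while yyyy is the calendar year, and Joda-Time has historically had trouble parsing zone names like EDT via zzz):

filter {
  date {
    # Quoted field name; plain Joda-Time pattern, no %{ } wrapper.
    # Caveat: zzz (zone abbreviations such as EDT) may not parse in Joda-Time.
    match => ["Time", "EEE MMM dd HH:mm:ss zzz yyyy"]
  }
}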

So I have done it this way:

curl -XPUT "http://localhost:9200/testnew" -d '
{
  "mappings": {
    "logs": {
      "properties": {
        "@timestamp": { "type": "date", "format": "dateOptionalTime" },
        "@version":   { "type": "string" },
        "SPA":        { "type": "double" },
        "SPB":        { "type": "double" },
        "Time":       { "type": "date", "format": "EEE MMM dd HH:mm:ss zzz YYYY" },
        "host":       { "type": "string" },
        "message":    { "type": "string" },
        "path":       { "type": "string" }
      }
    }
  }
}'

Then I loaded the log into Elasticsearch via Logstash.
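For reference, pointing the previously commented-out elasticsearch output at the new index would look roughly like this (a sketch reusing the option names from my config above; Logstash 1.x syntax):

output {
  elasticsearch {
    host => "localhost"
    index => "testnew"   # the index created with the mapping above
  }
  stdout { codec => rubydebug }
}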

A much happier look now.

Regards
A