Hi, I am a new user of ELK (version 6.5.1).
I want to create time-series plots with Timelion on the ELK server, using the fields captured and shipped by Filebeat on the client side.
This log is generated continuously by an application running on the client side:
2018-11-05 00:38:17 bench CDataViewDB::GetData : 0.02ms, overall (n=33): mean 0.2ms, stddev 0.01ms, max 0.04ms, min 0.01, total time [0.00s]
I want to plot "mean", "max", or "stddev" (each on the Y-axis) against the timestamp (on the X-axis).
After creating the index pattern from the Management screen, I can see the fields (min, max, mean, stddev) for filebeat-* on the Discover screen.
Nothing is displayed in Timelion when I try to create the plot (stddev on the Y-axis, timestamp on the X-axis) using 'metric=' with this expression:
.es(index=filebeat-*, timefield='@timestamp', metric='avg:stddev')
I can only get anything to display with the 'q=' query:
.es(q=mean), .es(q=stddev), .es(q=max), .es(q=min)
Also, the Y-axis then shows "count" instead of the field values:
.es(q=mean:0.04), .es(q=stddev:0.03), .es(q=max:0.16), .es(q=min:0.01)
I also checked the Type under Management -> filebeat-* by entering stddev, mean, etc. in the Filter box.
They all have "string" as the Type. Should they all be "number" instead?
Is this the reason why the query .es(index=filebeat-*, timefield='@timestamp', metric='avg:stddev') cannot find "stddev"? (Please see step 8 at the bottom of the page.)
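In case it matters, this is the kind of Logstash mutate filter I am guessing would turn those fields into numbers (just my own sketch, not tested, and I am not sure whether the existing filebeat-* index would also need to be reindexed for the mapping to change):
filter {
  mutate {
    # my assumption: convert the grok captures from strings to floats so metric='avg:stddev' has something to aggregate
    convert => {
      "mean"   => "float"
      "stddev" => "float"
      "max"    => "float"
      "min"    => "float"
    }
  }
}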
My steps are detailed below. Are there any steps I have missed or done incorrectly?
Thank you very much
BR,
Sam
Steps
(1) Set up ELK on the host server (steps 1 to 3 in the link) and Filebeat on the client (step 4)
(2) Log lines captured by Filebeat:
2018-11-05 00:38:17 bench CDataViewDB::GetData : 0.02ms, overall (n=33): mean 0.2ms, stddev 0.01ms, max 0.04ms, min 0.01, total time [0.00s]
Grok pattern (tested in Dev Tools -> Grok Debugger):
%{TIMESTAMP_ISO8601:logdate} bench %{GREEDYDATA:typeofevent} : %{NUMBER:first_ms}ms, overall \(n=%{NUMBER:n}\): mean %{NUMBER:mean}ms, stddev %{NUMBER:stddev}ms, max %{NUMBER:max}ms, min %{NUMBER:min}, total time \[%{NUMBER:time}s\]
Structured Data
{
"min": "0.01",
"max": "0.04",
"mean": "0.2",
"logdate": "2018-11-05 00:38:17",
"time": "0.00",
"stddev": "0.02",
"typeofevent": "CDataViewDB::GetData",
"n": "33"
}
(3) Entered the Grok pattern in the config file for logstash.
elktest@ELKServer:/etc/logstash/conf.d$ cat 10-syslog-filter.conf
filter {
if [input][type] == "log" {
grok {
match => { "message" => "%{TIMESTAMP_ISO8601:logdate} bench %{GREEDYDATA:typeofevent} : %{NUMBER:first_ms}ms, overall \(n=%{NUMBER:n}\): mean %{NUMBER:mean}ms, stddev %{NUMBER:stddev}ms, max %{NUMBER:max}ms, min %{NUMBER:min}, total time \[%{NUMBER:time}s\]" }
add_field => [ "received_at", "%{@timestamp}" ]
add_field => [ "received_from", "%{host}" ]
}
}
}
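Alternatively, I read that grok can cast captures directly, so a sketch of how I think the match line could look with type suffixes (assuming this casting is the missing piece):
grok {
  # same pattern as above, but asking grok to emit numbers instead of strings
  match => { "message" => "%{TIMESTAMP_ISO8601:logdate} bench %{GREEDYDATA:typeofevent} : %{NUMBER:first_ms:float}ms, overall \(n=%{NUMBER:n:int}\): mean %{NUMBER:mean:float}ms, stddev %{NUMBER:stddev:float}ms, max %{NUMBER:max:float}ms, min %{NUMBER:min:float}, total time \[%{NUMBER:time:float}s\]" }
}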
(4) Test Logstash configuration with this command:
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
Configuration OK
(5) Verified that Elasticsearch is receiving data in the Filebeat index with this command:
test@ELKServer:/etc/logstash/conf.d$ curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty'
{
"took" : 9,
"timed_out" : false,
"_shards" : {
"total" : 35,
"successful" : 35,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : 2556058,
"max_score" : 1.0,
.......
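To double-check the field types, I assume I can also ask Elasticsearch for the mapping of one of the fields directly, something like this (my own extra check, not part of the tutorial):
curl -XGET 'http://localhost:9200/filebeat-*/_mapping/field/stddev?pretty'
If that comes back as "text"/"keyword" rather than "float", it would match what Kibana shows under Management.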
(6) Now access the Kibana user interface from the Chrome browser
- Click "Create index pattern" under Management -> Kibana -> Index Patterns
- Type filebeat-* into the Index pattern input box
- Select @timestamp as the Time filter field name, then click the Create button to finish
(7) Go to Discover; I can see the fields (min, max, mean, stddev) from filebeat-*
(8) Enter this query in Timelion:
.es(index=filebeat-*, timefield='@timestamp', metric='avg:stddev')
Nothing is displayed
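For reference, this is the full expression I am ultimately hoping to get working, one series per field (assuming the metric='agg:field' syntax is right once the fields are numeric):
.es(index=filebeat-*, timefield='@timestamp', metric='avg:mean').label('mean'),
.es(index=filebeat-*, timefield='@timestamp', metric='max:max').label('max'),
.es(index=filebeat-*, timefield='@timestamp', metric='avg:stddev').label('stddev')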