Kibana Add New Metric

Hello,
I'm a newbie who just started using ELK. I have a log with the SQL execution time, and I would like to create a new visualization with execution time on the Y axis and the timeline on the X axis. Unfortunately Kibana only offers me some predefined metrics (count, max, min, ...) for the Y axis. Is there a way I can add my own field as a metric?

Thank you so much

I think what you want is the average execution time. This is quite easy to get with Kibana, if you have your data properly indexed in Elasticsearch.
What does your data look like in Elasticsearch?

Thank you for your answer. When I click Average and then try to select my field, I can't find it. Below is the mapping; the field is Execution_time:

{
  "logstash-sniffer" : {
    "mappings" : {
      "_default_" : {
        "dynamic_templates" : [
          {
            "message_field" : {
              "path_match" : "message",
              "match_mapping_type" : "string",
              "mapping" : {
                "norms" : false,
                "type" : "text"
              }
            }
          },
          {
            "string_fields" : {
              "match" : "*",
              "match_mapping_type" : "string",
              "mapping" : {
                "fields" : {
                  "keyword" : {
                    "ignore_above" : 256,
                    "type" : "keyword"
                  }
                },
                "norms" : false,
                "type" : "text"
              }
            }
          }
        ],
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "@version" : {
            "type" : "keyword"
          },
          "geoip" : {
            "dynamic" : "true",
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }
            }
          }
        }
      },
      "doc" : {
        "dynamic_templates" : [
          {
            "message_field" : {
              "path_match" : "message",
              "match_mapping_type" : "string",
              "mapping" : {
                "norms" : false,
                "type" : "text"
              }
            }
          },
          {
            "string_fields" : {
              "match" : "*",
              "match_mapping_type" : "string",
              "mapping" : {
                "fields" : {
                  "keyword" : {
                    "ignore_above" : 256,
                    "type" : "keyword"
                  }
                },
                "norms" : false,
                "type" : "text"
              }
            }
          }
        ],
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "@version" : {
            "type" : "keyword"
          },
          "Execution_time" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "IP" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "TIME" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "WORD" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "geoip" : {
            "dynamic" : "true",
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }
            }
          },
          "host" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "message" : {
            "type" : "text",
            "norms" : false
          },
          "path" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "tags" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "test" : {
            "type" : "text",
            "norms" : false,
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  }
}

As I expected, the execution time is stored as text.
While the stored values might look like numbers to you, to Elasticsearch (and therefore to Kibana) they are text, and you can't do calculations on text, only on numbers.

You need to store the field Execution_time as a number.
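For example, the mapping for that field would need to look something like this (just a sketch; whether float, double, or long fits best depends on your data):

"Execution_time" : {
  "type" : "float"
}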

Thank you so much, that must be it. I know this is beyond the scope of this thread, but can you please help me change it, either from Logstash itself or from Elasticsearch?

Well, I've only been actively analyzing ES for one month. For my use case (custom Apache web server logs) I've found explicit mapping to be useful; it gives me more control and accuracy. It's also more work.
You are currently using dynamic mappings.

What I did is create an index template* and define the mapping in the template.

*I'm having Logstash send data into indices based on the event's month

I'd recommend reading about it.

Index templates
Index mapping
Dynamic mapping - this one I haven't studied yet

Note that you'll have to reindex your already indexed data if you define new mappings.
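A minimal index template along those lines could look roughly like this (a sketch assuming Elasticsearch 6.x; the template name, index pattern, and type name "doc" are assumptions, so adjust them to your setup):

PUT _template/logstash-sniffer
{
  "index_patterns" : ["logstash-sniffer-*"],
  "mappings" : {
    "doc" : {
      "properties" : {
        "Execution_time" : {
          "type" : "float"
        }
      }
    }
  }
}

Any matching index created after the template is in place would then map Execution_time as a number, and the field should show up under Average in Kibana once you refresh the index pattern.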

Thank you, I just added a mutate filter in Logstash and it works :slight_smile:

mutate {
  convert => {
    "Execution_time" => "float"
  }
}

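One caveat: mutate's convert only changes the value inside the event. Indices that already mapped Execution_time as text keep that mapping, so the numeric field will only appear in newly created indices (and after refreshing the Kibana index pattern). You can verify it with a field-mapping request, using the index name from the mapping above:

GET logstash-sniffer/_mapping/field/Execution_time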

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.