Kibana doesn't show numeric field values

Hi, I'm sending messages with this format: 2015/01/06 07:15:43.073;1.849365
and I'm using this grok filter in Logstash:

grok {
  match => { "message" => "%{YEAR:year}/%{MONTHNUM:monthnum}/%{MONTHDAY:monthday} %{TIME:time};%{NUMBER:Value}" }
  add_field => ["datetime", "%{year}/%{monthnum}/%{monthday} %{time}"]
}
date {
  match => ["datetime", "yyyy/MM/dd HH:mm:ss"]
}

So in the end I can see the fields with their values in Kibana 3, all right.
The problem is that when I try to plot the numeric value of the field "Value", nothing shows up. I'm trying a histogram this way, but nothing appears.

In fact, when I change "Chart Value" to Count it displays correctly, but I want the total value.

Use %{NUMBER:Value:int} instead of %{NUMBER:Value}. That field has already been mapped as a string, so it may take until tomorrow's index before Kibana works as you'd expect (unless you reindex).

Hi!
I used "float" instead of integer, but I'm having the same problem with the visualization: it's not able to show the value on the Y axis.

Check how the field has been mapped for the index in question (use ES's get mapping API). That's what matters.
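For example, with curl (the index name here is just a guess; substitute the daily index you're actually writing to):

```sh
curl 'http://localhost:9200/logstash-2015.01.06/_mapping?pretty'
```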

It seems that it is considering it as a string...
I used localhost:9200/_mapping to check it. Is that right?
"Value":{"type":"multi_field","fields":{"Value":{"type":"string","omit_norms":true},"raw":{"type":"string","index":"not_analyzed","omit_norms":true,"index_options":"docs","include_in_all":false,"ignore_above":256}}},"datetime":{"type":"multi_field","fields":{"datetime":{"type":"string","omit_norms":true},"raw":

Yes, it's a string. You're presumably using daily indexes, so the next index you create should see the Value field correctly mapped.

Thanks for your response, Magnus. How could I do that?

The field in today's index should be mapped as a number rather than a string, so there's not much for you to do.

Indexes from today have the same problem. I've had to add a mutate block to the Logstash filter configuration:
mutate {
  convert => { "Value" => "float" }
}

But I thought that the grok expression %{NUMBER:Value:float} alone would be enough...

Yes, %{NUMBER:Value:float} should've been enough. Can you create a minimal configuration example that exhibits the problem?

My filter is this one:
filter {
  if [type] == "udp" {
    mutate {
      rename => ["@host", "host"]
    }
    dns {
      reverse => ["host"]
      action => "replace"
    }
    grok {
      match => { "message" => "%{YEAR:year}/%{MONTHNUM:monthnum}/%{MONTHDAY:monthday} %{TIME:time};%{NUMBER:Value:float}" }
      add_field => ["datetime", "%{year}/%{monthnum}/%{monthday} %{time}"]
    }
    date {
      match => ["datetime", "yyyy/MM/dd HH:mm:ss"]
    }
    mutate {
      convert => { "Value" => "float" }
    }
  }
}

and the messages have this structure: 1970/01/06 08:44:45.304;2.098389

In fact, when I use the grok debugger it shows it as a string too.

Works fine for me without the extra mutate filter:

$ cat data
1970/01/06 08:44:45.304;2.098389
$ cat test.config 
input { stdin { codec => plain } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => [
      "message",
      "%{YEAR:year}/%{MONTHNUM:monthnum}/%{MONTHDAY:monthday} %{TIME:time};%{NUMBER:Value:float}"
    ]
  }
}
$ /opt/logstash/bin/logstash -f test.config < data
{
       "message" => "1970/01/06 08:44:45.304;2.098389",
      "@version" => "1",
    "@timestamp" => "2015-08-10T07:53:29.625Z",
          "host" => "seldlx20533",
          "year" => "1970",
      "monthnum" => "01",
      "monthday" => "06",
          "time" => "08:44:45.304",
         "Value" => 2.098389
}

Which version of Logstash are you running? Your reference to the @host field suggests that you're running something really ancient.
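As for the grok debugger showing a string: that's expected. Grok patterns are regular expressions underneath, and regex captures are always strings; the :float suffix is a conversion step that Logstash applies after the match, and the debugger only runs the match. A rough Python analogy (not actual Logstash code):

```python
import re

# A regex capture is always a string, even when the text looks numeric.
m = re.match(r"(?P<Value>\d+\.\d+)", "2.098389")
raw = m.group("Value")

# Conceptually, this is what the :float suffix adds on top of the match.
coerced = float(raw)

print(type(raw).__name__, type(coerced).__name__)  # prints: str float
```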

Hi @magnusbaeck, you're showing Logstash output here, but I think that's not the same as the ES mapping.

I've sometimes seen the same behaviour. I think "data types" inside Logstash's pipeline are not (at least not directly) related to "data types" in ES, since Logstash doesn't define the mappings explicitly, am I right? ES has its own mechanisms for detecting types automatically.

Something like 2.098389 should be detected as a float in ES, but that's not happening here. Are you using templates? (Check with http://localhost:9200/_template?pretty.)
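If no template is forcing a mapping, one option is to install one that does. A minimal sketch (ES 1.x syntax; the template name value_as_float and the logstash-* pattern are just examples, adjust to your setup):

```sh
curl -XPUT 'http://localhost:9200/_template/value_as_float' -d '
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "Value": { "type": "float" }
      }
    }
  }
}'
```

Note that a template only takes effect for indexes created after it is installed.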

Indeed, if the OP is using an index template that maps the Value field as a string, then that's the problem, but I don't think that's the case. If the first Value seen by ES for a given index is a float, it will be mapped as a float.

Since the OP indicates that it works once a mutate filter explicitly converts the field to a float, the problem seems to be on the Logstash side after all.