Integer precision lost when inserting into Elasticsearch?


(Marcin Kuzmicki) #1

Hello,
I've got Logstash (1.4.2) inserting data into Elasticsearch (1.5.2).

Here's a sample command that I'm running:

echo "ID: 1029713618753128208" | /opt/logstash/bin/logstash -e 'input { stdin {} }  filter { grok { match => ["message", "ID: %{NUMBER:id:int}"]} } output { stdout{ codec => rubydebug } elasticsearch { host=> "127.0.0.1" cluster=>"main" protocol=>http } } '

The stdout codec prints the correct number for my id field:

{
       "message" => "ID: 1029713618753128208",
      "@version" => "1",
    "@timestamp" => "2015-06-29T10:50:06.463Z",
          "host" => "debian-vm1",
            "id" => 1029713618753128208
}

But when I query Elasticsearch, that id comes back as

1029713618753128200

It almost looks like there's some loss of precision on the integer.

Does it look like a known issue?

Thanks & regards
m


(Ed) #2

What does your query look like?

It doesn't look like rounding. Maybe you're hitting a maximum integer limit in ES, though I don't think that would produce a difference of 8.

Your mapping might also shed some light.


(Marcin Kuzmicki) #3

Aaah, good call, eperry.

Turns out Logstash/Elasticsearch return the data correctly, but Kibana's display mangles it.
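For anyone hitting this later: Kibana renders values in JavaScript, whose Number type is an IEEE-754 double, so only integers up to 2**53 can be represented exactly. A minimal Python sketch (Python floats are the same 64-bit doubles) of what happens to this id:

```python
# Why Kibana (JavaScript) cannot display this 64-bit integer exactly:
# doubles have a 53-bit significand, and this id exceeds 2**53.

big_id = 1029713618753128208        # the id from this thread
max_exact = 2 ** 53                 # largest range of exactly representable ints

print(big_id > max_exact)           # True: too large for an exact double
print(int(float(big_id)))           # 1029713618753128192: the nearest double
```

JavaScript then prints the shortest decimal string that round-trips to that nearest double, which is how the id ends up displayed as 1029713618753128200. Elasticsearch itself stores the full `long` value; only the browser-side rendering loses digits.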

I've raised an issue against Kibana 4.1.0.

Thanks
m
