Elasticsearch is indexing date fields wrong (milliseconds instead of seconds)

Hey folks!

I have a mapping defined as follows:

```
PUT test1
{
  "settings": {
    "index.mapping.ignore_malformed": true
  },
  "mappings": {
    "supersonic": {
      "dynamic": "true",
      "properties": {
        "date": {
          "type": "date",
          "format": "epoch_second"
        },
        "timestamp": {
          "type": "date",
          "format": "epoch_second"
        },
        "other_date": {
          "type": "date",
          "format": "epoch_millis"
        }
      }
    }
  }
}
```

Now, "other_date" is indexed and shown perfectly fine. The problem ist, "date" and "timestamp" are treated as milliseconds as well, thus giving me dates in Jan 1970. The dates themselves arrive within a JSON. Am I missing something here?

Thanks for any help!

What Elasticsearch version are you using? And could you send the JSON of a document you indexed that manifested this behavior?

Yes. The version is 2.1.1.
An example JSON would be:

> {"ipLongPeer":2130706433,
>    "biomass_cost":110,
>    "networkId":0,
>    "ipLong":3561950994,
>    "gameId":30,
>    "instanceId":0,
>    "tsEventAccepted":1450714430, 
>    "trevor_date":1450714430731,
>    "ipLongForwarded":3561950994,
>    "date":1450714430,
>    "production_type":3,
>    "entity_elements":"GA_1_0_0_2_0",
>    "version":"0.10.46",
>    "timestamp":1450714429,
>    "platform_id":-1,
>    "eventId":945,
>    "playerId":"mzaman",
>    "queue_size":1,
>    "entity_type":1100,
>    "production_id":"2015.12.21.17.13.45.8556370.3696111"}

The mapping, when fetched after the first events arrived, is:
```
{
  "test": {
    "mappings": {
      "supersonic": {
        "dynamic": "true",
        "properties": {
          "biomass_cost": {
            "type": "double"
          },
          "date": {
            "type": "date",
            "format": "epoch_second"
          },
          "entity_elements": {
            "type": "string"
          },
          "entity_type": {
            "type": "double"
          },
          "eventId": {
            "type": "double"
          },
          "gameId": {
            "type": "double"
          },
          "instanceId": {
            "type": "double"
          },
          "ipLong": {
            "type": "double"
          },
          "ipLongForwarded": {
            "type": "double"
          },
          "ipLongPeer": {
            "type": "double"
          },
          "networkId": {
            "type": "double"
          },
          "platform_id": {
            "type": "double"
          },
          "playerId": {
            "type": "string"
          },
          "production_id": {
            "type": "string"
          },
          "production_type": {
            "type": "double"
          },
          "queue_size": {
            "type": "double"
          },
          "timestamp": {
            "type": "date",
            "format": "epoch_second"
          },
          "trevor_date": {
            "type": "date",
            "format": "epoch_millis"
          },
          "tsEventAccepted": {
            "type": "double"
          },
          "version": {
            "type": "string"
          }
        }
      }
    }
  }
}
```
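Side note: `tsEventAccepted` also looks like an epoch-seconds value, but because it wasn't declared in the explicit mapping it was dynamically mapped as `double`. A field's type can't be changed once it has been mapped, but on a freshly created index it could be declared up front alongside the others; a sketch via the put-mapping API (this only succeeds while the field is not yet mapped):

```
# Fails with a merge conflict if tsEventAccepted is already mapped as double.
curl -XPUT 'localhost:9200/test1/_mapping/supersonic' -d'{
  "properties": {
    "tsEventAccepted": {
      "type": "date",
      "format": "epoch_second"
    }
  }
}'
```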

Tip from someone who's asked for help and provided it for years: format your JSON and wrap it in ``` so it gets code formatting; folks are much more likely to read it.

You're right, thanks for the hint :slight_smile:

I can't reproduce the issue:

```
curl -XDELETE localhost:9200/test1

curl -XPUT localhost:9200/test1 -d'{
  "settings": {
    "index.mapping.ignore_malformed": true
  },
  "mappings": {
    "supersonic": {
      "dynamic": "true",
      "properties": {
        "date": {
          "type": "date",
          "format": "epoch_second"
        },
        "timestamp": {
          "type": "date",
          "format": "epoch_second"
        },
        "other_date": {
          "type": "date",
          "format": "epoch_millis"
        }
      }
    }
  }
}'

curl -XPOST 'localhost:9200/test1/supersonic/1?refresh' -d'{
  "ipLongPeer": 2130706433,
  "biomass_cost": 110,
  "networkId": 0,
  "ipLong": 3561950994,
  "gameId": 30,
  "instanceId": 0,
  "tsEventAccepted": 1450714430,
  "trevor_date": 1450714430731,
  "ipLongForwarded": 3561950994,
  "date": 1450714430,
  "production_type": 3,
  "entity_elements": "GA_1_0_0_2_0",
  "version": "0.10.46",
  "timestamp": 1450714429,
  "platform_id": -1,
  "eventId": 945,
  "playerId": "mzaman",
  "queue_size": 1,
  "entity_type": 1100,
  "production_id": "2015.12.21.17.13.45.8556370.3696111"
}'

curl -XPOST localhost:9200/test1/supersonic/_search?pretty -d'{
  "query": {
    "range": {
      "date": {
        "lt": "2000-1-1T00:00:00",
        "format": "date_optional_time"
      }
    }
  }
}'
# No results

curl -XPOST localhost:9200/test1/supersonic/_search?pretty -d'{
  "query": {
    "range": {
      "date": {
        "gt": "2015-1-1T23:00:00",
        "lt": "2016-1-1T23:00:00",
        "format": "date_optional_time"
      }
    }
  }
}'
# One result
```

Sorry!

The problem became obsolete because I changed the dynamic mapping so it automatically assigns any field whose name contains "date", "timestamp", etc. to the date type. For that I need to send every date field in a homogeneous format, as milliseconds in my case. It is a mystery why I couldn't make it work before, though.
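For anyone landing here later, a dynamic template is one way to express that rule; a minimal sketch, where the index name `test2` and the template name `dates_as_millis` are made up for illustration:

```
# Any new field whose name contains "date" or "timestamp" is mapped
# as a date holding epoch milliseconds.
curl -XPUT 'localhost:9200/test2' -d'{
  "mappings": {
    "supersonic": {
      "dynamic_templates": [
        {
          "dates_as_millis": {
            "match_pattern": "regex",
            "match": ".*(date|timestamp).*",
            "mapping": {
              "type": "date",
              "format": "epoch_millis"
            }
          }
        }
      ]
    }
  }
}'
```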
But thanks anyway, I appreciate the help :slight_smile: