Convert Cloudflare Unix nanosecond timestamp to Elasticsearch date field

Hi Team,

I'm having trouble converting Unix nanosecond timestamps into an Elasticsearch date field. Please let me know how to create the index and import this data.

{
  "mappings": {
    "doc": {
      "properties": {
        "CacheCacheStatus": {"type": "string"},
        "CacheResponseBytes": {"type": "integer"},
        "CacheResponseStatus": {"type": "integer"},
        "CacheTieredFill": {"type": "boolean"},
        "EdgeStartTimestamp": {"type": "date"}
      }
    }
  }
}

Data

{"index":{"_index":"cloudflare-2018.09.01","_type":"log"}}
{"CacheCacheStatus":"unknown","CacheResponseBytes":32707,"CacheResponseStatus":200,"CacheTieredFill":false,"EdgeStartTimestamp":1535759910756000000}
{"index":{"_index":"cloudflare-2018.09.01","_type":"log"}}
{"CacheCacheStatus":"unknown","CacheResponseBytes":42537,"CacheResponseStatus":200,"CacheTieredFill":false,"EdgeStartTimestamp":1535759912213999872}

While importing into Elasticsearch I'm getting the error below.

{
    "create" : {
      "_index" : "cloudflare-2018.09.01",
      "_type" : "log",
      "_id" : "AWX7EQH50J_kGrzSCN-5",
      "status" : 400,
      "error" : {
        "type" : "mapper_parsing_exception",
        "reason" : "failed to parse [EdgeStartTimestamp]",
        "caused_by" : {
          "type" : "illegal_argument_exception",
          "reason" : "Invalid format: \"1535759912620000000\" is malformed at \"759912620000000\""
        }
      }
    }
}

Hey,

Elasticsearch does not yet support timestamps with nanosecond resolution. You could use an ingest pipeline to truncate the timestamp so that it fits within millisecond resolution.
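
For example, here is a minimal sketch of such a pipeline using a script processor that divides the nanosecond value by one million (the pipeline name cloudflare-ns-to-ms is just a placeholder, not something that ships with Elasticsearch):

PUT _ingest/pipeline/cloudflare-ns-to-ms
{
  "description": "Placeholder example: truncate EdgeStartTimestamp from nanoseconds to epoch milliseconds",
  "processors": [
    {
      "script": {
        "source": "ctx.EdgeStartTimestamp = ctx.EdgeStartTimestamp / 1000000L"
      }
    }
  ]
}

Then reference it when indexing, e.g. POST _bulk?pipeline=cloudflare-ns-to-ms. The truncated value is in epoch milliseconds, which the default date format (strict_date_optional_time||epoch_millis) accepts.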

--Alex
