How to override the @timestamp field with an Elasticsearch ingest pipeline

Hi guys

I use Filebeat to collect logs and send them to Elasticsearch directly. The log lines contain a time, such as:

2017-12-12 15:41:30.000 [pool-8-thread-18] DEBUG c.h.i.c.m.servicemanager.task.PassiveNodeHbTask - PassiveNodeHbTask started

So I want to use the time in the log to override @timestamp, and I don't want to use Logstash.

I have defined a pipeline:

PUT _ingest/pipeline/my-pipeline-05
{
  "description" : "describe pipeline",
  "processors" : [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp}"]
      }
    }
  ]
}

and I use this pipeline, but it didn't override @timestamp; it created another field named timestamp.

Here is the indexed JSON:

{
  "_index": "tocc-2017.12.12",
  "_type": "doc",
  "_id": "rLe0SWAB2D7R7OG8M7Fc",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2017-12-12T07:50:33.475Z",
    "log": {
      "level": "temp",
      "source": "temp",
      "type": "JAVA LOG"
    },
    "prospector": {
      "type": "log"
    },
    "source": "D:\\Program Files (x86)\\iVMS8600-TOCC\\log\\cms\\debug.log",
    "message": "2017-12-12 15:50:31.045 [pool-8-thread-16] DEBUG c.h.cms.cache.core.memory.SimpleMemoryCache - get key [#0#service_info_cache_version],value is:d1f9cef1-c406-4c27-9d8e-efadb92d5a1a",
    "timestamp": "2017-12-12 15:50:31.045"
  },
  "fields": {
    "@timestamp": [
      "2017-12-12T07:50:33.475Z"
    ]
  },
  "sort": [
    1513065033475
  ]
}

So what can I do to override the @timestamp field?

Use the date processor.
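For reference, a minimal date processor that parses the grokked timestamp field (the field name captured by the grok pattern above) into @timestamp might look like this. This is only a sketch based on the ingest date processor documentation; note that the format string is case-sensitive:

{
  "date" : {
    "field" : "timestamp",
    "target_field" : "@timestamp",
    "formats" : ["yyyy-MM-dd HH:mm:ss.SSS"]
  }
}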

Thanks, I have already tried that, and Elasticsearch and Filebeat didn't show any errors.

But I couldn't get any messages if I used the pipeline below:

PUT _ingest/pipeline/my-pipeline-05
{
  "description" : "describe pipeline",
  "processors" : [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:logatime}"]
      },
      "date" : {
        "field" : "logatime",
        "target_field" : "@timestamp",
        "formats" : ["YYYY-MM-DD HH:mm:ss.SSS"]
      }
    }
  ]
}

And if I didn't use the date processor,

I could get the messages.

So am I missing something?

Your configuration parameters for the date filter do not seem to match what is described in the documentation, so I would recommend fixing that.

I didn't use Logstash; in fact I used Elasticsearch's ingest pipelines, and I followed that documentation.

Indeed. Not sure how I misread that. I would recommend that you compare the pattern against the docs and make sure all parts are specified using the correct case.

OK, I will try and rewrite the date filter. I meant that I didn't use Logstash; I misunderstood your reply because I believed there might be some differences between Logstash's pipelines and Elasticsearch's ingest pipelines.

OK, I found the problem.

First, I needed to use the correct case in the date format.

Second, if you still cannot get any messages, it could be that you didn't set the timezone. Set the timezone and the problem is solved.
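Putting both fixes together, a working pipeline looks something like this (each processor is its own object in the processors array; the timezone value here is only an assumption, based on the 8-hour offset between the log time and the original @timestamp in my documents, so adjust it to your environment):

PUT _ingest/pipeline/my-pipeline-05
{
  "description" : "describe pipeline",
  "processors" : [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:logatime}"]
      }
    },
    {
      "date" : {
        "field" : "logatime",
        "target_field" : "@timestamp",
        "formats" : ["yyyy-MM-dd HH:mm:ss.SSS"],
        "timezone" : "Asia/Shanghai"
      }
    }
  ]
}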