Why can't I add a date field?

filter {
  grok {
    match => [ "message", "\[%{TIMESTAMP_ISO8601:TimeStamp}\]" ]
  }
  date {
    match => [ "TimeStamp", "ISO8601" ]
    target => "timestamp"
    timezone => "Asia/Shanghai"
  }
}

I input [2018-04-13 15:00:00.000] and get this error:
[2018-04-13T15:36:14,470][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash(2018.04.13)", :_type=>"logstash", :_routing=>nil}, #<LogStash::Event:0x10ad1942>], :response=>{"index"=>{"_index"=>"logstash(2018.04.13)", "_type"=>"logstash", "_id"=>"g0vuvWIBoK0nbYCH664-", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [TimeStamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-04-13 15:00:00.000" is malformed at " 15:00:00.000""}}}}}
{
"@timestamp" => 2018-04-13T07:36:14.303Z,
"host" => "BIH-D-6331",
"@version" => "1",
"TimeStamp" => "2018-04-13 15:00:00.000",
"message" => "[2018-04-13 15:00:00.000]\r",
"timestamp" => 2018-04-13T07:00:00.000Z
}

Do you really need to keep the TimeStamp field that it's complaining about? If you remove it this problem will go away.
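
If you only need TimeStamp to feed the date filter, one way to do that is to drop it once it has been parsed. A minimal sketch, assuming nothing else in your pipeline reads the field:

date {
  match => [ "TimeStamp", "ISO8601" ]
  target => "timestamp"
  timezone => "Asia/Shanghai"
  # remove_field is only applied when the date parse succeeds
  remove_field => [ "TimeStamp" ]
}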

So if I add a new date field, do I need to remove the original field?

No, you can have as many date fields as you like. But ES apparently doesn't recognize "2018-04-13 15:00:00.000" as a date. If you want to store it in a field you have to make sure it's a string field.

But when I changed TimeStamp it became the UTC time zone.

You're not changing TimeStamp, so I don't know what you mean, but the date filter always produces UTC timestamps.

So how can I make a date field in my own time zone?

My answer in "I want to save index by my own time zone" applies here too.
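
That answer isn't quoted here, but a common approach (a sketch, not necessarily what the linked post says) is to leave @timestamp in UTC and add a separate string field carrying the local wall-clock time, e.g. with a ruby filter:

ruby {
  # Hypothetical field name local_time: shift the UTC event time by +8 hours
  # (Asia/Shanghai) and format it as a plain string. @timestamp stays in UTC.
  code => "event.set('local_time', (event.get('@timestamp').time + 8 * 3600).strftime('%Y-%m-%d %H:%M:%S.%L'))"
}

Because local_time is just a string, Elasticsearch won't reinterpret it, and Kibana can still apply its own browser time zone to @timestamp.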

Thank you, you are right. But now I've found a new problem when I use date_histogram:

  "aggs" : {
        "salestime" : {
            "date_histogram" : {
                "field" : "@timestamp",
                "interval" : "day"
            },
}
}
   

"aggregations": {
    "salestime": {
      "buckets": [
        {
          "key_as_string": "2018-04-02T00:00:00.000Z",
          "key": 1522627200000,

It uses the UTC time zone. So can I use my time zone here?

I don't know. I suggest you post a new topic in the Elasticsearch category as this isn't a Logstash question.
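
For what it's worth, the date_histogram aggregation does accept a time_zone option in recent Elasticsearch versions; a sketch, assuming your version supports it and keeping the rest of your request as-is:

"aggs" : {
  "salestime" : {
    "date_histogram" : {
      "field" : "@timestamp",
      "interval" : "day",
      "time_zone" : "Asia/Shanghai"
    }
  }
}

With time_zone set, the day buckets align to local midnight rather than UTC midnight.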

OK, thank you.
