Can't change field type

I have the following two fields in my MongoDB collection:

createdAt : 2019-05-04T09:47:38.767+00:00
updatedAt : 2019-05-04T09:47:41.027+00:00

I want to change both fields' type to date. I've tried a lot of things and ended up with the Logstash config file below:

input {
    mongodb {
        uri => 'mongodb://localhost:27017/admin'
        placeholder_db_dir => 'Desktop/okoknow12/'
        placeholder_db_name => 'logstash_sqlite.db'
        collection => 'Transactions'
        batch_size => 202
        parse_method => "simple"
    }
}

filter {
    date {
        match => ["createdAt", "yyyy-MM-dd HH:mm:ss.SSS Z"]
        target => "createdAtnew"
        timezone => "UTC"
    }

    mutate {
        convert => { "amount" => "integer" }
    }
    mutate {
        convert => { "totalFee" => "integer" }
    }
    mutate {
        rename => { "_id" => "id" }
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "zaamahedhy12"
    }
    stdout { codec => rubydebug }
}

What do you get in the rubydebug output for createdAt and updatedAt?

Your date filter does not match your examples of those dates.

[2019-05-13T22:13:06,136][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-13T22:13:06,186][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.7.1"}
[2019-05-13T22:13:06,301][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"672fe7a7-c23a-4aaf-b497-caf4b5515d43", :path=>"Desktop/okoknow15/uuid"}
[2019-05-13T22:13:32,793][INFO ][logstash.inputs.mongodb  ] Using version 0.1.x input plugin 'mongodb'. This plugin isn't well supported by the community and likely has no maintainer.
[2019-05-13T22:13:37,250][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-05-13T22:13:38,929][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-05-13T22:13:39,859][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-05-13T22:13:40,125][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-05-13T22:13:40,136][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-05-13T22:13:40,279][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-05-13T22:13:40,401][INFO ][logstash.outputs.elasticsearch] Using default mapping template

{
            "tags" => [
        [0] "_dateparsefailure"
    ],
            "type" => "transfer",
       "createdAt" => "2019-05-09 11:51:30 UTC",
      "@timestamp" => 2019-05-13T16:01:06.258Z,
        "totalFee" => 30,
        "receiver" => "5cd3fe907d0ee06732a0de3a",
  "senderWalletId" => "5cd411d37d0ee06732a0de46",
            "txId" => "f709118a42da04f2cafae21e1344a06c8776208cb4d0d0bc00084eebbcb6a8d3",
          "status" => "success",
            "host" => "ubuntu",
         "logdate" => "2019-05-09T11:51:30+00:00",
        "@version" => "1",
          "amount" => 2000,
              "id" => "5cd414427d0ee06732a0de50",
             "__v" => 0,
          "sender" => "5cd411d27d0ee06732a0de45",
       "log_entry" => "{\"_id\"=>BSON::ObjectId('5cd414427d0ee06732a0de50'), \"txId\"=>\"f709118a42da04f2cafae21e1344a06c8776208cb4d0d0bc00084eebbcb6a8d3\", \"sender\"=>BSON::ObjectId('5cd411d27d0ee06732a0de45'), \"receiver\"=>BSON::ObjectId('5cd3fe907d0ee06732a0de3a'), \"senderWalletId\"=>BSON::ObjectId('5cd411d37d0ee06732a0de46'), \"receiverWalletId\"=>BSON::ObjectId('5cd3fe907d0ee06732a0de3b'), \"status\"=>\"success\", \"type\"=>\"transfer\", \"amount\"=>2000, \"totalFee\"=>30, \"createdAt\"=>2019-05-09 11:51:30 UTC, \"updatedAt\"=>2019-05-09 11:51:30 UTC, \"__v\"=>0}",
        "mongo_id" => "5cd414427d0ee06732a0de50",
       "updatedAt" => "2019-05-09 11:51:30 UTC",
"receiverWalletId" => "5cd3fe907d0ee06732a0de3b"
}
Yeah, I know, that's me desperately trying funny things to get my way out of it.

That value does not have milliseconds (.SSS), and it has a timezone name (ZZZ), not a timezone offset (Z or ZZ).

date {
    match => ["createdAt", "yyyy-MM-dd HH:mm:ss ZZZ"]
    target => "createdAtnew"
}

That gets me:

   "createdAt" => "2019-05-09 11:51:30 UTC",
   "updatedAt" => "2019-05-09 11:51:30 UTC",
"createdAtnew" => 2019-05-09T11:51:30.000Z,

Thanks for the reply. I was wondering, what if I change the target to createdAt? And if so, can I use it as a time filter in Kibana?

Yes, if you use

date {
    match => ["createdAt", "yyyy-MM-dd HH:mm:ss ZZZ"]
    target => "createdAt"
}

then the string in the createdAt field will get replaced with a LogStash::Timestamp object. The mapping of an existing index cannot be changed, so the field will only become a date in a newly created index; once your index rolls over to a new day, you can refresh the index pattern's field list in Kibana and you should see it as a timestamp.
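
Putting it together, a minimal filter sketch that parses both fields in place, assuming both always arrive in the "yyyy-MM-dd HH:mm:ss ZZZ" form from your rubydebug output:

filter {
    # On a successful match each field is overwritten with a
    # LogStash::Timestamp; on failure the event is tagged with
    # _dateparsefailure and the original string is left as-is.
    date {
        match => ["createdAt", "yyyy-MM-dd HH:mm:ss ZZZ"]
        target => "createdAt"
    }
    date {
        match => ["updatedAt", "yyyy-MM-dd HH:mm:ss ZZZ"]
        target => "updatedAt"
    }
}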

