Cannot export Date fields to MongoDB

Hello.

I'm facing an issue when I try to write date fields to MongoDB. I've tried to convert them in several ways, but with no success.

In my input JSON I have a collection like this:


"Colecs": [
		{
			"Value": {
				"Value": "Tester"
			},
			"FieldName": {
				"Value": "Name"
			},
			"Type": {
				"Value": "String"
			}
		},
		{
			"FieldName": {
				"Value": "DoB"
			},
			"Value": {
				"Value": "7/17/2019 3:58:31 PM"
			},
			"Type": {
				"Value": "DateTime"
			}
		},
		{
			"FieldName": {
				"Value": "Height"
			},
			"Value": {
				"Value": "1.75"
			},
			"Type": {
				"Value": "Float"
			}
		}
	],

In my config I use this Ruby filter:


ruby {
	code => "
		require 'date'
		require 'time'

		event.get('[Colecs]').each do |item|

			varValue = item['Value']['Value']
			varFieldName = item['FieldName']['Value']
			varType = item['Type']['Value']

			if varType == 'Integer'
				event.set(varFieldName, varValue.to_i)
			elsif varType == 'Float'
				event.set(varFieldName, varValue.to_f)
			elsif varType == 'DateTime'
				event.set(varFieldName, { 'date' => varValue }) # <-- here is where I tried to convert
			else
				event.set(varFieldName, varValue.to_s)
			end
		end
	"
}
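For what it's worth, one way to turn that string into an actual time value inside the ruby filter would be to parse it with `Time.strptime` before calling `event.set`. This is only a sketch in plain Ruby (the `parse_colecs_datetime` helper name is mine, and `LogStash::Timestamp`, mentioned in the comment, only exists inside Logstash, so it isn't used here):

```ruby
require 'time'

# Hypothetical helper: parse the "M/D/YYYY H:MM:SS AM/PM" strings used in
# the Colecs collection into Ruby Time objects. Inside the Logstash ruby
# filter one could then do something like
#   event.set(varFieldName, LogStash::Timestamp.new(t))
# instead of storing the raw string.
def parse_colecs_datetime(value)
  # strptime accepts the non-zero-padded month/day/hour in these strings;
  # %I is the 12-hour clock and %p the AM/PM marker.
  Time.strptime(value, '%m/%d/%Y %I:%M:%S %p')
end

t = parse_colecs_datetime('7/17/2019 3:58:31 PM')
puts t.iso8601
```

Whether the mongodb output then stores that as a BSON date still depends on the output plugin, as the rest of this thread shows.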

For output I'm using this:


mongodb {
	uri => "mongodb://localhost:27017"
	database => "TestMongo"
	collection => "MyCollection"
	#isodate => true
	id => "writeInMongoDB"
	codec => "json"
}

In MongoDB I get a collection like this:


{
	"_id": "5d3ec9a70ebee3db90d8c4f2",
	"Height": 1.75,
	"DoB": {
	      "date": "7/17/2019 3:58:31 PM"
	},
	"Name": "Tester",
	"@timestamp": "\"2019-07-29T10:25:42.344Z\""
}

What I can't do is convert [DoB][date] from a string to a date.
How can I do that?

I've already tried converting it in the Ruby filter in several ways, without success.
Using the MongoDB output's isodate option gives me an error, which is why it's commented out.
Using the Logstash date filter has no effect: the field stays a string.
Any ideas?

What was the configuration of the date filter?

If I try this:

date {
	match => [ "[DoB][date]", "ISO8601" ]
}

I get:

{
	"_id": "5d4072530ebee3c344edb4f5",
	"@timestamp": "\"2019-07-30T16:37:39.228Z\"",
	"DoB": {
		"date": "7/17/2019 3:58:31 PM"
	},
	"tags": [
		"_dateparsefailure"
	],
	"Height": 1.75,
	"Name": "Tester"
}
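The `_dateparsefailure` tag is expected here: "7/17/2019 3:58:31 PM" is a US-style string, not ISO 8601, so the ISO8601 pattern can't match it. A quick way to see the mismatch in plain Ruby (Logstash's matcher is Joda-based, but the idea is the same):

```ruby
require 'time'

# The stored string is not ISO 8601, so an ISO 8601 parser rejects it...
begin
  Time.iso8601('7/17/2019 3:58:31 PM')
  parsed_as_iso = true
rescue ArgumentError
  parsed_as_iso = false  # this branch is taken
end

# ...while an ISO 8601 rendering of the same instant parses fine.
iso_ok = Time.iso8601('2019-07-17T15:58:31Z')

puts parsed_as_iso
puts iso_ok.utc.iso8601
```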

And if I try this (the parse succeeds, but without a target the date filter only writes the result to @timestamp):

date {
	match => [ "[DoB][date]", "M/d/YYYY hh:mm:ss a" ]
}

I get:

{
	"_id": "5d4074520ebee32517168961",
	"Height": 1.75,
	"DoB": {
		"date": "7/17/2019 3:58:31 PM"
	},
	"Name": "Tester",
	"@timestamp": "\"2019-07-17T14:58:31.000Z\""
}

And finally, if I try this:

date {
	match => [ "[DoB][date]", "M/d/YYYY hh:mm:ss a" ]
        target => "[DoB][date]"
}

I get an error:

[WARN ][logstash.outputs.mongodb ] Failed to send event to MongoDB, retrying in 30 seconds {:event=>#LogStash::Event:0xd8bcf4, :exception=>#<ArgumentError: wrong number of arguments (2 for 1)>}

Do you get a stack trace with that error message?

Got this from the Logstash log.

For future reference: I solved my problem by downgrading logstash-output-mongodb from version 3.1.6 to 3.1.5.

Now the date filter:

date {
	match => [ "[DoB][date]", "M/d/YYYY hh:mm:ss a" ]
        target => "[DoB][date]"
}

is working, and MongoDB receives the field as a date (it wasn't necessary to use the MongoDB output's isodate option).

Thank you @Badger for answering me!