Hello.
I'm having trouble writing date fields to MongoDB. I've tried to convert them in several ways, but with no success.
In my input JSON I have a collection like this:
"Colecs": [
{
"Value": {
"Value": "Tester"
},
"FieldName": {
"Value": "Name"
},
"Type": {
"Value": "String"
}
},
{
"FieldName": {
"Value": "DoB"
},
"Value": {
"Value": "7/17/2019 3:58:31 PM"
},
"Type": {
"Value": "DateTime"
}
}
{
"FieldName": {
"Value": "Height"
},
"Value": {
"Value": "1.75"
},
"Type": {
"Value": "Float"
}
},
],
In my config I use this Ruby filter:
ruby {
  code => "
    require 'date'
    require 'time'
    event.get('[Colecs]').each do |item|
      varValue = item['Value']['Value']
      varFieldName = item['FieldName']['Value']
      varType = item['Type']['Value']
      if varType == 'Integer'
        event.set(varFieldName, varValue.to_i)
      elsif varType == 'Float'
        event.set(varFieldName, varValue.to_f)
      elsif varType == 'DateTime'
        event.set(varFieldName, { 'date' => varValue }) # <-- here is where I tried to convert
      else
        event.set(varFieldName, varValue.to_s)
      end
    end
  "
}
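For the DateTime branch specifically, one of the variations I tried looks roughly like this (just a sketch of the kind of thing I attempted; the strptime format string and the LogStash::Timestamp wrapping are my own guesses), yet the value still reaches MongoDB as a plain string:

    elsif varType == 'DateTime'
      # parse the US-formatted string into a Ruby Time, then wrap it as a Logstash timestamp
      parsed = Time.strptime(varValue, '%m/%d/%Y %I:%M:%S %p')
      event.set(varFieldName, LogStash::Timestamp.new(parsed))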
For the output I'm using this:
mongodb {
  uri => "mongodb://localhost:27017"
  database => "TestMongo"
  collection => "MyCollection"
  #isodate => true
  id => "writeInMongoDB"
  codec => "json"
}
In MongoDB I get a document like this:
{
  "_id": "5d3ec9a70ebee3db90d8c4f2",
  "Height": 1.75,
  "DoB": {
    "date": "7/17/2019 3:58:31 PM"
  },
  "Name": "Tester",
  "@timestamp": "\"2019-07-29T10:25:42.344Z\""
}
What I can't manage is converting [DoB][date] from a string into an actual date. How can I do that? I've already tried several conversions in the Ruby filter, with no success.
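For clarity, what I'm aiming for is DoB stored as an actual date in MongoDB, i.e. something like this (hypothetical target, give or take time-zone handling):

  {
    "_id": "5d3ec9a70ebee3db90d8c4f2",
    "Name": "Tester",
    "Height": 1.75,
    "DoB": ISODate("2019-07-17T15:58:31Z")
  }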
Enabling isodate on the mongodb output gives me an error, which is why it's commented out. Using the Logstash date filter has no effect either; the field stays a string. So far none of my attempts have worked.
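For reference, the date filter attempt looked roughly like this (placed after the ruby filter; the field reference and the Joda format string are my best guess for "7/17/2019 3:58:31 PM", and I varied both between tries):

  date {
    match => [ "[DoB][date]", "M/d/yyyy h:mm:ss a" ]
    target => "[DoB][date]"
  }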
Any ideas?