Parsing MongoDB input fully via Logstash => Elasticsearch

I'm parsing a MongoDB collection into Logstash; the config file is as follows:

input {
    mongodb {
        uri => "<mongouri>"
        placeholder_db_dir => "<path>"
        collection => "modules"
        batch_size => 5000
    }
}
filter {
    mutate {
        rename => { "_id" => "mongo_id" }
        remove_field => ["host", "@version"]
    }
    json {
        source => "message"
        target => "log"
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        hosts => ["localhost:9200"]
        action => "index"
        index => "mongo_log_modules"
    }
}

Outputs 2 of the 3 documents from the collection into Elasticsearch:

{
    "mongo_title" => "user",
      "log_entry" => "{\"_id\"=>BSON::ObjectId('60db49309fbbf53f5dd96619'), \"title\"=>\"user\", \"modules\"=>[{\"module\"=>\"user-dashboard\", \"description\"=>\"User Dashborad\"}, {\"module\"=>\"user-assessment\", \"description\"=>\"User assessment\"}, {\"module\"=>\"user-projects\", \"description\"=>\"User projects\"}]}",
       "mongo_id" => "60db49309fbbf53f5dd96619",
        "logdate" => "2021-06-29T16:24:16+00:00",
    "application" => "mongo-modules",
     "@timestamp" => 2021-10-02T05:08:38.091Z
}
{
    "mongo_title" => "candidate",
      "log_entry" => "{\"_id\"=>BSON::ObjectId('60db49519fbbf53f5dd96644'), \"title\"=>\"candidate\", \"modules\"=>[{\"module\"=>\"candidate-dashboard\", \"description\"=>\"User Dashborad\"}, {\"module\"=>\"candidate-assessment\", \"description\"=>\"User assessment\"}]}",
       "mongo_id" => "60db49519fbbf53f5dd96644",
        "logdate" => "2021-06-29T16:24:49+00:00",
    "application" => "mongo-modules",
     "@timestamp" => 2021-10-02T05:08:38.155Z
}

Seems like the stdout output puts unparsable code into

"log_entry"

Even after adding the "rename" fields, "modules" won't get added as a field.

I've tried the grok and mutate filters, but after the _id, the %{DATA}, %{QUOTEDSTRING} and %{WORD} patterns aren't working for me.

I've also tried adding a nested mapping to the index, but that didn't seem to work either.

Is there anything else I can try to get the FULLY nested document into Elasticsearch?

I would mutate [log_entry] into JSON:

    # replace Ruby hash arrows with JSON colons
    mutate { gsub => [ "log_entry", "=>", ": " ] }
    # replace BSON::ObjectId('...') with a plain quoted string
    mutate { gsub => [ "log_entry", "BSON::ObjectId\('([0-9a-z]+)'\)", '"\1"' ] }
    # parse the cleaned-up string and drop the raw field
    json { source => "log_entry" remove_field => [ "log_entry" ] }

will get you

       "_id" => "60db49309fbbf53f5dd96619",
     "title" => "user",
   "modules" => [
    [0] {
             "module" => "user-dashboard",
        "description" => "User Dashborad"
    },
    [1] {
             "module" => "user-assessment",
        "description" => "User assessment"
    },
    [2] {
             "module" => "user-projects",
        "description" => "User projects"
    }
],

Note that when the placeholder code in the mongodb input initializes the placeholder record for a collection, it sets its id to that of the first record in the collection. When it initializes the cursor to fetch data it fetches everything after the first record.

You cannot use a mongodb input to fetch the first record in the db (unless you can initialize the placeholder_db, stop logstash, modify the db using some other tooling, then restart logstash -- I am not sure whether this would work).
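
A related aside: if you ever need to re-ingest a collection from scratch, pointing the input at a fresh placeholder location should make the plugin re-initialize its placeholder record on the next start. A minimal sketch, assuming the plugin re-creates the sqlite db when the file is missing (the directory below is hypothetical):

input {
    mongodb {
        uri => "<mongouri>"
        # hypothetical fresh directory with no existing sqlite db, so the
        # placeholder record gets re-initialized on startup
        placeholder_db_dir => "/tmp/logstash-mongodb-fresh"
        placeholder_db_name => "logstash_sqlite.db"
        collection => "modules"
        batch_size => 5000
    }
}

Even then, per the behavior described above, the re-initialized placeholder takes the id of the first record, so that first record is still skipped.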

It works pretty well for most of the collections, which is really great. Is it about how you write the second argument in the gsub? I tried writing the mutations for the rest of the collections; sometimes it would take some values, other times it would stay the same, missing some in the process.

{
       "@version" => "1",
    "application" => "mongo-skills",
       "mongo_id" => "5ff86b44ff600800171ec9ab",
     "@timestamp" => 2021-10-05T07:52:21.633Z,
           "name" => "Gestión",
     "created_at" => "2021-01-08T14:25:08Z",
     "updated_at" => "2021-01-08T14:25:08Z",
        "logdate" => "2021-01-08T14:25:08+00:00",
           "host" => "DESKTOP-KPJ5TLR",
            "__v" => 0,
      "log_entry" => "{\"_id\": \"5ff86b44ff600800171ec9ab\", \"active\": true, \"transversal\": false, \"skill\": [\"5ff85b46ff600800171ec962\", \"5ff76a36ff600800171ec7ba\", \"5ff76a36ff600800171ec7bb\", \"5ff76a36ff600800171ec7bc\", \"5ff76a36ff600800171ec7be\", \"5ff76a36ff600800171ec7bf\", \"5ff76a36ff600800171ec7c6\", \"5ff76a36ff600800171ec7c8\", \"5ff76a36ff600800171ec7c9\"], \"name\": \"Gestión\", \"domain\": \"5ff768ebff600800171ec7b7\", \"created_at\": 2021-01-08 14:25:08 UTC, \"updated_at\": 2021-01-08 14:25:08 UTC, \"__v\": 0}"
}
{
       "@version" => "1",
    "application" => "mongo-skills",
       "mongo_id" => "5ff86bb7ff600800171ec9ac",
     "@timestamp" => 2021-10-05T07:52:21.637Z,
           "name" => "Investigación de usuarios",
     "created_at" => "2021-01-08T14:27:03Z",
     "updated_at" => "2021-01-08T14:27:03Z",
        "logdate" => "2021-01-08T14:27:03+00:00",
           "host" => "DESKTOP-KPJ5TLR",
            "__v" => 0,
      "log_entry" => "{\"_id\": \"5ff86bb7ff600800171ec9ac\", \"active\": true, \"transversal\": false, \"skill\": [\"5ff85b03ff600800171ec961\", \"5ff76b03ff600800171ec7fb\", \"5ff76b03ff600800171ec802\", \"5ff76b03ff600800171ec80c\", \"5ff76b03ff600800171ec815\", \"5ff76b03ff600800171ec818\", \"5ff76b03ff600800171ec820\", \"5ff76b03ff600800171ec821\", \"5ff76b03ff600800171ec822\", \"5ff76b03ff600800171ec823\"], \"name\": \"Investigación de usuarios\", \"domain\": \"5ff7698cff600800171ec7b8\", \"created_at\": 2021-01-08 14:27:03 UTC, \"updated_at\": 2021-01-08 14:27:03 UTC, \"__v\": 0}"
}
{
       "@version" => "1",
    "application" => "mongo-skills",
       "mongo_id" => "6005f0f420856a001839e791",
     "@timestamp" => 2021-10-05T07:52:21.646Z,
           "name" => "Soft Abilities",
     "created_at" => "2021-01-18T20:35:00Z",
     "updated_at" => "2021-01-18T20:35:00Z",
        "logdate" => "2021-01-18T20:35:00+00:00",
           "host" => "DESKTOP-KPJ5TLR",
            "__v" => 0,
      "log_entry" => "{\"_id\": \"6005f0f420856a001839e791\", \"active\": true, \"transversal\": false, \"skill\": [\"6005ed4120856a001839e790\"], \"name\": \"Soft Abilities\", \"domain\": \"5ff76884ff600800171ec7b6\", \"created_at\": 2021-01-18 20:35:00 UTC, \"updated_at\": 2021-01-18 20:35:00 UTC, \"__v\": 0}"
}
{
       "@version" => "1",
    "application" => "mongo-skills",
       "mongo_id" => "609451c8c200c300175310fa",
     "@timestamp" => 2021-10-05T07:52:21.649Z,
           "name" => "ignacio ",
     "created_at" => "2021-05-06T20:30:00Z",
     "updated_at" => "2021-05-06T20:30:00Z",
        "logdate" => "2021-05-06T20:30:00+00:00",
           "host" => "DESKTOP-KPJ5TLR",
            "__v" => 0,
      "log_entry" => "{\"_id\": \"609451c8c200c300175310fa\", \"active\": true, \"transversal\": false, \"skill\": [\"604b77ca79d1960017f55aca\", \"5ff76ef5ff600800171ec8b0\", \"5ff76ef5ff600800171ec8a7\"], \"name\": \"ignacio \", \"domain\": \"5ff75f1eff600800171ec7af\", \"created_at\": 2021-05-06 20:30:00 UTC, \"updated_at\": 2021-05-06 20:30:00 UTC, \"__v\": 0}"
}
{
       "@version" => "1",
    "application" => "mongo-skills",
       "mongo_id" => "60945209c200c300175310fb",
     "@timestamp" => 2021-10-05T07:52:21.651Z,
           "name" => "ignacio montero",
     "created_at" => "2021-05-06T20:31:05Z",
     "updated_at" => "2021-05-06T20:31:05Z",
        "logdate" => "2021-05-06T20:31:05+00:00",
           "host" => "DESKTOP-KPJ5TLR",
            "__v" => 0,
      "log_entry" => "{\"_id\": \"60945209c200c300175310fb\", \"active\": true, \"transversal\": false, \"skill\": [\"5ff76b76ff600800171ec828\", \"60180a275ea72a001713ea4f\", \"5ff76ef5ff600800171ec8a9\"], \"name\": \"ignacio montero\", \"domain\": \"5ff75f1eff600800171ec7af\", \"created_at\": 2021-05-06 20:31:05 UTC, \"updated_at\": 2021-05-06 20:31:05 UTC, \"__v\": 0}"

"tags" seems to just be shown as two empty brackets, so the failure doesn't matter much since it has no values.

{
        "logdate" => "2021-01-18T20:35:00+00:00",
     "@timestamp" => 2021-10-05T07:59:13.387Z,
     "updated_at" => "2021-01-18T20:35:00Z",
            "__v" => 0,
       "mongo_id" => "6005f0f420856a001839e791",
      "log_entry" => "{\"_id\": \"6005f0f420856a001839e791\", \"active\": true, \"transversal\": false, \"skill\": [\"6005ed4120856a001839e790\"], \"name\": \"Soft Abilities\", \"domain\": \"5ff76884ff600800171ec7b6\", \"created_at\": 2021-01-18 20:35:00 UTC, \"updated_at\": 2021-01-18 20:35:00 UTC, \"__v\": 0}",
           "name" => "Soft Abilities",
       "@version" => "1",
     "created_at" => "2021-01-18T20:35:00Z",
           "tags" => [
        [0] "_jsonparsefailure"
    ],
           "host" => "DESKTOP-KPJ5TLR",
    "application" => "mongo-skillsets",
            "_id" => "6005f0f420856a001839e791"
}

Any other suggestions on how to mutate would be greatly appreciated.

If you are getting a _jsonparsefailure tag then the filter will log a message indicating what it objected to.

How would I go about finding the log message?

Here is my input for skills:

input {
    mongodb {
        uri => "<uri>"
        placeholder_db_dir => "../opt/logstash-mongodb/"
        placeholder_db_name => "logstash_sqlite.db"
        collection => 'skills'
        batch_size => 5000
        add_field => {"application" => "mongo-skills"}
        codec => json
    }
}
filter {
    if [application] == "mongo-skills" {
        mutate { gsub => [ "log_entry", "=>", ": " ] }
        mutate { remove_field => ["_id"] }
        mutate { gsub => [ "log_entry", "BSON::ObjectId\('([0-9a-z]+)'\)", '"\1"' ] }
        json { source => "log_entry" remove_field => [ "log_entry" ] }
    }
}
output {
    if [application] == "mongo-skills" {
        stdout {
            codec => rubydebug
        }
        elasticsearch {
            hosts => ["localhost:9200"]
            action => "index"
            index => "mongo_log_skills"
        }
    }
}

Output:

{
           "host" => "DESKTOP-KPJ5TLR",
           "tags" => [
        [0] "_jsonparsefailure"
    ],
    "application" => "mongo-skills",
       "mongo_id" => "60a431126b6b6d001773ec3e",
        "logdate" => "2021-05-18T21:26:42+00:00",
            "__v" => 0,
     "updated_at" => "2021-05-18T21:26:42Z",
     "created_at" => "2021-05-18T21:26:42Z",
     "@timestamp" => 2021-10-05T16:33:08.254Z,
      "log_entry" => "{\"_id\": \"60a431126b6b6d001773ec3e\", \"active\": true, \"tags\": [], \"name\": \"CAMILA?;ABRIGO;cabrigo@garmendia.cl;desarrollador\", \"domain\": \"5ff7621eff600800171ec7b0\", \"__v\": 0, \"created_at\": 2021-05-18 21:26:42 UTC, \"updated_at\": 2021-05-18 21:26:42 UTC}",       
           "name" => "CAMILA?;ABRIGO;cabrigo@garmendia.cl;desarrollador",
       "@version" => "1"
}
{
           "host" => "DESKTOP-KPJ5TLR",
           "tags" => [
        [0] "_jsonparsefailure"
    ],
    "application" => "mongo-skills",
       "mongo_id" => "60a431126b6b6d001773ec3f",
        "logdate" => "2021-05-18T21:26:42+00:00",
            "__v" => 0,
     "updated_at" => "2021-05-18T21:26:42Z",
     "created_at" => "2021-05-18T21:26:42Z",
     "@timestamp" => 2021-10-05T16:33:08.255Z,
      "log_entry" => "{\"_id\": \"60a431126b6b6d001773ec3f\", \"active\": true, \"tags\": [], \"name\": \"IGNACIO?;ABRIGO;Sb.nashxo.sk8@hotmail.com;desarrollador\", \"domain\": \"5ff7621eff600800171ec7b0\", \"__v\": 0, \"created_at\": 2021-05-18 21:26:42 UTC, \"updated_at\": 2021-05-18 21:26:42 UTC}", 
           "name" => "IGNACIO?;ABRIGO;Sb.nashxo.sk8@hotmail.com;desarrollador",
       "@version" => "1"
}
{
           "host" => "DESKTOP-KPJ5TLR",
           "tags" => [
        [0] "_jsonparsefailure"
    ],
    "application" => "mongo-skills",
       "mongo_id" => "60a431126b6b6d001773ec40",
        "logdate" => "2021-05-18T21:26:42+00:00",
            "__v" => 0,
     "updated_at" => "2021-05-18T21:26:42Z",
     "created_at" => "2021-05-18T21:26:42Z",
     "@timestamp" => 2021-10-05T16:33:08.257Z,
      "log_entry" => "{\"_id\": \"60a431126b6b6d001773ec40\", \"active\": true, \"tags\": [], \"name\": \"FRANCISCA?;ABUMOHOR;fran.afull@live.cl;desarrollador\", \"domain\": \"5ff7621eff600800171ec7b0\", \"__v\": 0, \"created_at\": 2021-05-18 21:26:42 UTC, \"updated_at\": 2021-05-18 21:26:42 UTC}",    
           "name" => "FRANCISCA?;ABUMOHOR;fran.afull@live.cl;desarrollador",
       "@version" => "1"
}

It will get written to stdout, and probably also to /var/log/logstash/logstash-plain.log

If you are running logstash as a service then your service manager may provide a way to view stdout.

For the examples you give it looks like something has already parsed the log_entry, so there is no reason to use a json filter to do it all over again. The filter is probably objecting to

"created_at": 2021-05-18 21:26:42 UTC,

You could wrap that in quotes using

mutate { gsub => [ "log_entry", "(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \w+,)", '"\1"' ] }
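
One caveat with that pattern: the capture group includes the trailing comma, so the comma ends up inside the quotes, and the final timestamp before the closing brace (which is followed by } rather than a comma) is never matched; both effects are visible in the output further down. A variant that leaves the comma out of the quotes, assuming the timestamps always appear unquoted in log_entry:

    mutate { gsub => [ "log_entry", "(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \w+)", '"\1"' ] }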

but, as I said, it has already been parsed, so why bother?

True, it was already parsed! I removed that json filter so it doesn't run again. When you add \d{4}, does that wrap the "<value>" of the text? What if I wanted to match a certain value more specifically?

I used this input:

input {
    # note: conditionals like "if [application] == ..." only work in filter
    # and output blocks, not in input, so the field is added here instead
    mongodb {
        uri => "<uri>"
        placeholder_db_dir => "../opt/logstash-mongodb/"
        placeholder_db_name => "logstash_sqlite.db"
        collection => '^skills$'
        batch_size => 5000
        add_field => { "application" => "mongo-skills" }
    }
}
filter {
    if [application] == "mongo-skills" { 
        mutate {
            gsub => [ "log_entry", "(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \w+,)", '"\1"' ]
            remove_field => ["_id"]
        }
        json { 
            source => "log_entry" 
            remove_field => [ "log_entry"] 
        }
    }
}

output {
    if [application] == "mongo-skills" {
        stdout {
            codec => rubydebug
        }
        elasticsearch {
            hosts => ["localhost:9200"]
            action => "index"
            index => "mongo_log_skills"
        }
    }
}

It will get written to stdout, and probably also to /var/log/logstash/logstash-plain.log

Great catch. Checking \logs\logstash-plain.log in the logstash folder shows the full log:

[2021-10-05T12:24:11,280][INFO ][logstash.runner          ] Log4j configuration path used is: C:\Users\Eduardo Diaz\elasticstacklocal\logstash-7.15.0\config\log4j2.properties

[2021-10-05T12:24:11,294][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.15.0", "jruby.version"=>"jruby 9.2.19.0 (2.5.8) 2021-06-15 55810c552b OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [mswin32-x86_64]"}

[2021-10-05T12:24:11,454][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified

[2021-10-05T12:24:14,470][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK

[2021-10-05T12:24:14,472][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.

[2021-10-05T12:24:14,968][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

[2021-10-05T12:24:15,433][INFO ][org.reflections.Reflections] Reflections took 80 ms to scan 1 urls, producing 120 keys and 417 values 

[2021-10-05T12:24:18,840][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://localhost:9200"]}

[2021-10-05T12:24:18,892][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}

[2021-10-05T12:24:18,917][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://localhost:9200/"}

[2021-10-05T12:24:18,937][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (7.14.1) {:es_version=>7}

[2021-10-05T12:24:18,940][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}

[2021-10-05T12:24:19,064][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)

[2021-10-05T12:24:19,063][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Configuration is data stream compliant but due backwards compatibility Logstash 7.x will not assume writing to a data-stream, default behavior will change on Logstash 8.0 (set `data_stream => true/false` to disable this warning)

[2021-10-05T12:24:19,084][WARN ][logstash.javapipeline    ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary

[2021-10-05T12:24:19,166][INFO ][logstash.javapipeline    ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x235d59f run>"}

[2021-10-05T12:24:19,203][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}

[2021-10-05T12:24:19,228][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}

[2021-10-05T12:24:19,238][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}

[2021-10-05T12:24:19,244][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.14.1) {:es_version=>7}

[2021-10-05T12:24:19,245][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}

[2021-10-05T12:24:19,282][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}

[2021-10-05T12:24:19,309][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/Users/Eduardo Diaz/elasticstacklocal/logstash-7.15.0/bin/configurations/skills.conf"], :thread=>"#<Thread:0x224f5cdf@C:/Users/Eduardo Diaz/elasticstacklocal/logstash-7.15.0/logstash-core/lib/logstash/pipeline_action/create.rb:54 run>"}

[2021-10-05T12:24:19,987][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.81}

[2021-10-05T12:24:20,071][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}

[2021-10-05T12:24:20,139][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.83}

[2021-10-05T12:24:22,759][INFO ][logstash.inputs.mongodb  ][main] Registering MongoDB input

[2021-10-05T12:24:25,198][INFO ][logstash.inputs.mongodb  ][main] init placeholder for logstash_since_skills: {"_id"=>BSON::ObjectId('5ff76a36ff600800171ec7b9'), "active"=>true, "tags"=>[], "name"=>"CCNA", "domain"=>BSON::ObjectId('5ff768ebff600800171ec7b7'), "__v"=>0, "created_at"=>2021-01-07 20:08:22 UTC, "updated_at"=>2021-01-07 20:08:22 UTC}

[2021-10-05T12:24:25,201][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}

[2021-10-05T12:24:25,249][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}

Is there a way to get the "domain" and "active" from the log_entry?

Output is currently:

{
     "created_at" => "2021-05-18T21:26:42Z",
     "updated_at" => "2021-05-18T21:26:42Z",
           "host" => "DESKTOP-KPJ5TLR",
           "name" => "MAYRA?;UGARTE;la_vakyta18@hotmail.com;IT Architect",
            "__v" => 0,
       "@version" => "1",
     "@timestamp" => 2021-10-05T18:24:28.066Z,
           "tags" => [],
       "mongo_id" => "60a431126b6b6d001773ed6b",
      "log_entry" => "{\"_id\"=>BSON::ObjectId('60a431126b6b6d001773ed6b'), \"active\"=>true, \"tags\"=>[], \"name\"=>\"MAYRA?;UGARTE;la_vakyta18@hotmail.com;IT Architect\", \"domain\"=>BSON::ObjectId('5ff7621eff600800171ec7b0'), \"__v\"=>0, \"created_at\"=>\"2021-05-18 21:26:42 UTC,\" \"updated_at\"=>2021-05-18 21:26:42 UTC}",
    "application" => "mongo-skills",
        "logdate" => "2021-05-18T21:26:42+00:00"
}
{
     "created_at" => "2021-05-18T21:26:42Z",
     "updated_at" => "2021-05-18T21:26:42Z",
           "host" => "DESKTOP-KPJ5TLR",
           "name" => "NELLY?;VALENZUELA;criatura11@hotmail.com;Software Developer",
            "__v" => 0,
       "@version" => "1",
     "@timestamp" => 2021-10-05T18:24:28.073Z,
           "tags" => [],
       "mongo_id" => "60a431126b6b6d001773ed73",
      "log_entry" => "{\"_id\"=>BSON::ObjectId('60a431126b6b6d001773ed73'), \"active\"=>true, \"tags\"=>[], \"name\"=>\"NELLY?;VALENZUELA;criatura11@hotmail.com;Software Developer\", \"domain\"=>BSON::ObjectId('5ff7621eff600800171ec7b0'), \"__v\"=>0, \"created_at\"=>\"2021-05-18 21:26:42 UTC,\" \"updated_at\"=>2021-05-18 21:26:42 UTC}",
    "application" => "mongo-skills",
        "logdate" => "2021-05-18T21:26:42+00:00"
}
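
For what it's worth, combining the gsub mutates from earlier in the thread should let the json filter parse log_entry, at which point "domain" and "active" become top-level fields alongside "name" and "tags". A minimal, untested sketch, assuming log_entry still contains the Ruby-hash style shown above:

filter {
    if [application] == "mongo-skills" {
        # Ruby hash arrows -> JSON colons
        mutate { gsub => [ "log_entry", "=>", ": " ] }
        # BSON::ObjectId('...') -> plain quoted string (this covers "domain")
        mutate { gsub => [ "log_entry", "BSON::ObjectId\('([0-9a-z]+)'\)", '"\1"' ] }
        # wrap the bare timestamps in quotes, keeping the comma outside
        mutate { gsub => [ "log_entry", "(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \w+)", '"\1"' ] }
        json { source => "log_entry" remove_field => [ "log_entry" ] }
    }
}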

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.