Hello guys, I'm running Logstash to sync my MongoDB collections into Elasticsearch.
I use the logstash-input-mongodb plugin to read from Mongo; this is my initial config file:
input {
  mongodb {
    uri => 'mongodb://localhost:27017/db'
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'catalog'
  }
}
output {
  elasticsearch {
    hosts => ['localhost:9200']
    user => 'elastic'
    password => 'xxxxx'
    index => 'catalog'
  }
}
I have a problem synchronizing documents that contain arrays.
This is my insert on Mongo:
db.getCollection('catalog').insert({
"schema" : {
"first" : {
"name" : "name",
"type" : "type",
"namespace" : "namespace",
"fields" : [
{
"name" : "first_field",
"type" : "string"
},
{
"name" : "second_field",
"type" : "string"
}
]
}
}
})
and this is the result on elasticsearch:
{
"_index": "catalog",
"_type": "doc",
"_id": "Jd_lL2IBwDdr4Pihholf",
"_score": 1,
"_source": {
"@version": "1",
"@timestamp": "2018-03-16T17:39:52.899Z",
"logdate": "2018-03-16T17:35:10+00:00",
"schema_first_name": "name",
"schema_first_namespace": "namespace",
"log_entry": """{"_id"=>BSON::ObjectId('5aac004edf7f789a5cbf6fe5'), "schema"=>{"first"=>{"name"=>"name", "type"=>"type", "namespace"=>"namespace", "fields"=>[{"name"=>"first_field", "type"=>"string"}, {"name"=>"second_field", "type"=>"string"}]}}}""",
"host": "",
"schema_first_type": "type",
"mongo_id": "5aac004edf7f789a5cbf6fe5"
}
}
I've lost the "fields" array.
I tried different solutions, for example:
filter {
  split {
    field => "fields"
  }
}
but I always receive the following error:
[logstash.filters.split ] Only String and Array types are splittable. field:fields is of type = NilClass
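One idea, if nothing better exists: since the whole document survives inside log_entry (as a Ruby hash string with "=>" and a BSON::ObjectId wrapper), it might be possible to rewrite that string into valid JSON and re-parse it, so "fields" becomes a real array on the event. This is only a rough, untested sketch, and the log_entry_json and document field names are placeholders I made up:

```
filter {
  ruby {
    # Hypothetical workaround: rebuild JSON from the Ruby-hash string in [log_entry].
    # Fragile: it assumes "=>" and parentheses only appear in the BSON::ObjectId
    # wrapper, not inside field values. 39.chr / 34.chr are the single/double
    # quote characters, used to avoid quote-escaping inside this config string.
    code => '
      entry = event.get("log_entry")
      unless entry.nil?
        json = entry.gsub("=>", ":")
                    .gsub("BSON::ObjectId(", "")
                    .gsub(")", "")
                    .tr(39.chr, 34.chr)
        event.set("log_entry_json", json)
      end
    '
  }
  # Parse the rebuilt JSON into a nested object, arrays included.
  json {
    source => "log_entry_json"
    target => "document"
    remove_field => ["log_entry_json"]
  }
}
```

If that worked, the split filter should then be pointed at the nested field reference, e.g. field => "[document][schema][first][fields]", rather than a bare "fields".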
Any suggestions?
Thanks in advance.
PPL