Indexing MongoDB into ES in real time

Logstash cannot serialize fields of type ObjectId, in this case _id but also end._id, which has to be excluded as well. I think a statement along these lines should do it:

statement => "db.trips.find({},{'_id': false, 'end._id': false});"

I think the mapping you indicated in Indexer Mongodb dans ES en temps réel should be fine.

Good evening,

I'm going to test that mapping, but right now I'm struggling to get the Logstash config running.
I put the JDBC driver in the config folder so I wouldn't have to deal with a path; the exception is gone, but now I get an error:

  jdbc {
    jdbc_driver_library => "mongojdbc3.0.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/atseeone?authSource=admin"
    jdbc_user => ""
    jdbc_password => ""
    schedule => "*/30 * * * * *"
    statement => "db.trips.find({},{'_id': false});"
  }

Error:

  Pipeline_id:main
  Plugin: <LogStash::Inputs::Jdbc schedule=>"*/30 * * * * *", jdbc_password=><password>, statement=>"db.trips.find({},{'_id': false});", jdbc_driver_library=>"mongojdbc3.0.jar", jdbc_connection_string=>"jdbc:mongodb://localhost:27017/atseeone?authSource=admin", id=>"99473a9d2c60035f754dc0fdfc90d4e5ecd62b1a56c4c17de536ed564db5ef5a", jdbc_driver_class=>"com.dbschema.MongoJdbcDriver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_2a455c80-b95f-48fa-a164-dde8f9bb7545", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, plugin_timezone=>"utc", last_run_metadata_path=>"C:\\Users\\karou/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true, use_prepared_statements=>false>
  Error: unable to load mongojdbc3.0.jar from :jdbc_driver_library, file not readable (please check user and group permissions for the path)
  Exception: LogStash::PluginLoadingError
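
The message itself points at the likely cause: a bare filename in jdbc_driver_library is typically resolved against Logstash's working directory, not the config folder, so the jar ends up not found or not readable from where Logstash actually runs. A minimal sketch of the usual fix, using a hypothetical absolute path (and assuming the user running Logstash has read permission on the file):

    jdbc_driver_library => "C:/Users/karou/logstash/config/mongojdbc3.0.jar"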

We use a REST service with JSON data (Spring Boot) to save into Kafka, then push via connectors into Mongo and Elasticsearch (or anything else). That works well and scales; the reference database is Kafka, so in case of a rebuild or a version migration of Elastic or Mongo it stays near real time.
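
For context, a minimal sketch of the REST-to-Kafka side of that setup, assuming Spring Boot with spring-kafka on the classpath; the controller, endpoint, and topic name here are hypothetical:

    import org.springframework.http.ResponseEntity;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class TripController {

      private final KafkaTemplate<String, String> kafka;

      public TripController(KafkaTemplate<String, String> kafka) {
        this.kafka = kafka;
      }

      // Kafka is the reference store: the REST payload is appended to a topic,
      // and sink connectors fan the events out to MongoDB and Elasticsearch.
      @PostMapping("/trips")
      public ResponseEntity<Void> save(@RequestBody String tripJson) {
        kafka.send("trips", tripJson);
        return ResponseEntity.accepted().build();
      }
    }

Sink connectors then consume the topic, so either store can be rebuilt or migrated just by replaying it.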
