Hi all,
I am new to both ELK and MongoDB, and I am trying to view MongoDB data in Kibana.
However, the index is not getting created, and the Logstash log shows the following error:
[2019-12-05T16:49:14,073][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 9 completed successfully in 56.27 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:14,818][ERROR][logstash.inputs.jdbc ][main] Java::JavaSql::SQLException: <eval>:1:7 Expected ; but found count
SELECT count(*) AS "COUNT" FROM (db.testcaseresults.find({})) AS "T1" LIMIT 1
^ in <eval> at line number 1 at column number 7: SELECT count(*) AS "COUNT" FROM (db.testcaseresults.find({})) AS "T1" LIMIT 1
[2019-12-05T16:49:15,009][WARN ][logstash.inputs.jdbc ][main] Attempting a count query raised an error, the generated count statement is most likely incorrect but check networking, authentication or your statement syntax {"exception"=>"Java::JavaSql::SQLException: <eval>:1:7 Expected ; but found count\r\nSELECT count(*) AS \"COUNT\" FROM (db.testcaseresults.find({})) AS \"T1\" LIMIT 1\r\n ^ in <eval> at line number 1 at column number 7"}
[2019-12-05T16:49:15,029][WARN ][logstash.inputs.jdbc ][main] Ongoing count statement generation is being prevented
[2019-12-05T16:49:15,046][DEBUG][logstash.inputs.jdbc ][main] Executing JDBC query {:statement=>"db.testcaseresults.find({})", :parameters=>{:sql_last_value=>2019-12-05 10:01:00 UTC}}
[2019-12-05T16:49:15,079][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2019-12-05T16:49:15,151][DEBUG][org.mongodb.driver.protocol.command][main] Sending command '{"listDatabases": 1, "nameOnly": true}' with request id 10 to database admin on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,162][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 10 completed successfully in 11.55 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,205][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-12-05T16:49:15,813][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-12-05T16:49:15,823][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-12-05T16:49:15,842][DEBUG][org.mongodb.driver.protocol.command][main] Sending command '{"find": "testcaseresults"}' with request id 11 to database automationframework on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,847][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 11 completed successfully in 5.74 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,917][INFO ][logstash.inputs.jdbc ][main] (0.773581s) db.testcaseresults.find({})
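From the tail of the log, the failed count query does not look fatal by itself: Logstash disables further count generation and the actual find still completes (the final INFO line). My guess (just an assumption on my part) is that the DBSchema driver evaluates statements as MongoDB shell JavaScript, so the SQL-style count wrapper cannot be parsed. To check whether any documents actually come through the pipeline, I am planning to also print events to stdout (untested sketch):

```
output {
  stdout { codec => rubydebug }
}
```

If nothing is printed, the problem would be on the input side rather than in Elasticsearch.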
The following is my configuration file:
# Logstash configuration for creating a simple
# MongoDB -> Logstash -> Elasticsearch pipeline.
input {
  jdbc {
    jdbc_driver_library => "mongojdbc1.6.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/automationframework"
    jdbc_validate_connection => true
    jdbc_user => ""
    #statement => "db.databasename.find({ timestampfield: { $gte: (:sql_last_value)}})"
    #statement => "select * from testcaseresults"
    statement => "db.testcaseresults.find({})"
    schedule => "*/10 * * * *"
    last_run_metadata_path => "D:\Installed\.logstash_jdbc_last_run"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mongo_index"
  }
}
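Once the index is created, I would also like to pull only new documents on each run using sql_last_value, as in the first commented-out statement above. This is an untested sketch on my side: insertedTime is just a placeholder for whatever timestamp field testcaseresults actually has, and I am not sure the DBSchema driver substitutes :sql_last_value inside a Mongo shell statement:

```
input {
  jdbc {
    jdbc_driver_library => "mongojdbc1.6.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/automationframework"
    jdbc_user => ""
    # insertedTime is a placeholder field name; by default :sql_last_value
    # is the time of the previous run, stored in last_run_metadata_path
    statement => "db.testcaseresults.find({ insertedTime: { $gte: :sql_last_value } })"
    schedule => "*/10 * * * *"
    last_run_metadata_path => "D:\Installed\.logstash_jdbc_last_run"
  }
}
```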