ELK logstash config statement error for mongodb input-jdbc plugin

Hi All

I am new to both ELK and MongoDB, and I am trying to view MongoDB data in Kibana.
However, the index is not getting created due to the following error:

[2019-12-05T16:49:14,073][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 9 completed successfully in 56.27 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:14,818][ERROR][logstash.inputs.jdbc     ][main] Java::JavaSql::SQLException: <eval>:1:7 Expected ; but found count
SELECT count(*) AS "COUNT" FROM (db.testcaseresults.find({})) AS "T1" LIMIT 1
      ^ in <eval> at line number 1 at column number 7: SELECT count(*) AS "COUNT" FROM (db.testcaseresults.find({})) AS "T1" LIMIT 1
[2019-12-05T16:49:15,009][WARN ][logstash.inputs.jdbc     ][main] Attempting a count query raised an error, the generated count statement is most likely incorrect but check networking, authentication or your statement syntax {"exception"=>"Java::JavaSql::SQLException: <eval>:1:7 Expected ; but found count\r\nSELECT count(*) AS \"COUNT\" FROM (db.testcaseresults.find({})) AS \"T1\" LIMIT 1\r\n       ^ in <eval> at line number 1 at column number 7"}
[2019-12-05T16:49:15,029][WARN ][logstash.inputs.jdbc     ][main] Ongoing count statement generation is being prevented
[2019-12-05T16:49:15,046][DEBUG][logstash.inputs.jdbc     ][main] Executing JDBC query {:statement=>"db.testcaseresults.find({})", :parameters=>{:sql_last_value=>2019-12-05 10:01:00 UTC}}
[2019-12-05T16:49:15,079][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2019-12-05T16:49:15,151][DEBUG][org.mongodb.driver.protocol.command][main] Sending command '{"listDatabases": 1, "nameOnly": true}' with request id 10 to database admin on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,162][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 10 completed successfully in 11.55 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,205][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-12-05T16:49:15,813][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-12-05T16:49:15,823][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-12-05T16:49:15,842][DEBUG][org.mongodb.driver.protocol.command][main] Sending command '{"find": "testcaseresults"}' with request id 11 to database automationframework on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,847][DEBUG][org.mongodb.driver.protocol.command][main] Execution of command with request id 11 completed successfully in 5.74 ms on connection [connectionId{localValue:2, serverValue:250}] to server localhost:27017
[2019-12-05T16:49:15,917][INFO ][logstash.inputs.jdbc     ][main] (0.773581s) db.testcaseresults.find({})

And the following is my configuration file:

# Logstash configuration for creating a simple
# MongoDB -> Logstash -> Elasticsearch pipeline.

input {
  jdbc {
    jdbc_driver_library => "mongojdbc1.6.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/automationframework"
    jdbc_validate_connection => true
    jdbc_user => ""
    #statement => "db.databasename.find({ timestampfield: { $gte: (:sql_last_value) } })"
    #statement => "select * from testcaseresults"
    statement => "db.testcaseresults.find({})"
    schedule => "*/10 * * * *"
    last_run_metadata_path => "D:\Installed\.logstash_jdbc_last_run"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mongo_index"
  }
}
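For what it is worth, once the basic pipeline works, the commented-out incremental statement in the config above could be brought back so each scheduled run only fetches new documents. A minimal sketch, assuming the collection has a date field (here named `timestamp`, a hypothetical name) that can be compared against `:sql_last_value`:

```
statement => "db.testcaseresults.find({ timestamp: { $gte: :sql_last_value } })"
```

The jdbc input substitutes `:sql_last_value` before handing the statement to the driver (the debug log above shows it in `:parameters`), and `last_run_metadata_path` persists that value across runs.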

You can ignore this. The attempt to do a COUNT did not work, so the mixin will stop trying to do them.

I did not understand you. The index is still not created; what would the solution be?

I do not know. Is it possible the query returns no rows? Can you add

output { stdout { codec => rubydebug } }

and see if that shows any events. My previous answer was just saying that a failure to COUNT rows does not prevent events being processed.
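For reference, the debug output can sit alongside the existing elasticsearch output in a single `output` block, so events go to both destinations (this just combines the two snippets already shown in this thread):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mongo_index"
  }
  stdout { codec => rubydebug }
}
```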

@Badger, since I did not post the whole log, it was impossible to find the error.
I have now found it; it is a ConverterException:

[2019-12-10T12:07:41,813][WARN ][logstash.inputs.jdbc     ][main] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=org.bson.types.ObjectId, simple name=ObjectId>} 

I tried to convert it in a filter, but I get the same error:

filter {
  mutate {
    convert => {
      "_id" => "string"
      "testcaseID" => "string"
    }
  }
}

Could you please tell me how to do the conversion?

I think you would need to CAST the object in the SQL query.
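One way to do that with this driver, sketched under the assumption that the MongoDB server is 4.0 or newer (which added the `$toString` aggregation operator) and that `testcaseID` is also stored as an ObjectId: run an aggregation with a `$project` stage so the ObjectId values are converted to plain strings before they ever reach Logstash, which should avoid the `MissingConverterException`:

```
statement => "db.testcaseresults.aggregate([ { $project: { _id: { $toString: '$_id' }, testcaseID: { $toString: '$testcaseID' } } } ])"
```

Note that `$project` only keeps the listed fields, so any other fields you need would have to be added to that stage (or passed through with `1`).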

Yes, I will look into that. I don't know MongoDB yet, so I will learn it first.
Thank You.

When I see the error

SELECT count(*) AS "COUNT" FROM (db.testcaseresults.find({})) AS "T1" LIMIT 1

I think of the issue I had this summer:
https://discuss.elastic.co/t/jdbc-static-getting-a-sql-syntax-error-i-didnt-write/187844/4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.