Problem reading SQL Server Data

Morning all

I have a simple table:

CREATE TABLE [dbo].[DataAuditTable](
[Id] [int] IDENTITY(1,1) NOT NULL,
[SystemAuditId] [int] NOT NULL,
[RevisionStamp] [datetime] NOT NULL,
[TableName] nvarchar NOT NULL,
[Action] nvarchar NOT NULL,
[Mappings] nvarchar NULL,
[UseCustomEncryption] [bit] NOT NULL,
[DataVersion] [tinyint] NOT NULL
)

which I am trying to index in Elasticsearch. My aim is to use the DataChanged column, which contains JSON data, in Elasticsearch.

My config file looks like:

input
{

# Define DataAudit input

jdbc 
{
	# Set JDBC Options

	jdbc_driver_library => "C:\ELK\JDBC\sqljdbc_6.0\enu\sqljdbc42.jar"
	jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
	jdbc_connection_string => "jdbc:sqlserver://dev_sandpit\autotesting;databaseName=INRstarAudit;selectMethod=cursor"
	jdbc_user => "auditreader"
	jdbc_password => "auditreader"	

	# Persistence settings	

	use_column_value => true
	tracking_column => "id"
	record_last_run => true
	last_run_metadata_path => "c:\ELK\LogStash\logstash-5.0.0\persistence\LastId_DataAudit"

	# Schedule (cron-like) 

	schedule => "* * * * *" 

	# type
	
	type => "DataAudit"

	# Query

	statement => "SELECT Id, DataChanged, LEN(DataChanged) AS Length FROM [INRstarAudit].[dbo].[DataAudit] WHERE Id > :sql_last_value"

}

}

#filter
#{
#	json
#	{
#		source => "DataChanged"
#		target => "DataChanged_nojson"
#	}
#
#	mutate
#	{
#		remove_field => "patient"
#	}
#}

output
{
	file
	{
		path => "c:\elk\temp\test.out"
	}

	elasticsearch
	{
		index => "audit"
		document_type => "%{type}"
		document_id => "%{id}"
		hosts => "localhost:9200"
	}
}

When I run the query in RazorSQL using the same JDBC driver it works OK, but in Logstash I get the output:

input
{

# Define DataAudit input

jdbc 
{
	# Set JDBC Options

	jdbc_driver_library => "C:\ELK\JDBC\sqljdbc_6.0\enu\sqljdbc42.jar"
	jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
	jdbc_connection_string => "jdbc:sqlserver://dev_sandpit\autotesting;databaseName=INRstarAudit;selectMethod=cursor"
	jdbc_user => "auditreader"
	jdbc_password => "auditreader"	

	# Persistence settings	

	use_column_value => true
	tracking_column => "id"
	record_last_run => true
	last_run_metadata_path => "c:\ELK\LogStash\logstash-5.0.0\persistence\LastId_DataAudit"

	# Schedule (cron-like) 

	schedule => "* * * * *" 

	# type
	
	type => "DataAudit"

	# Query

	statement => "SELECT Id, DataChanged, LEN(DataChanged) AS Length FROM [INRstarAudit].[dbo].[DataAudit] WHERE Id > :sql_last_value"

}

}

#filter
#{
#	json
#	{
#		source => "DataChanged"
#		target => "DataChanged_nojson"
#	}
#
#	mutate
#	{
#		remove_field => "patient"
#	}
#}

output
{
	file
	{
		path => "c:\elk\temp\test.out"
	}

	elasticsearch
	{
		index => "audit"
		document_type => "%{type}"
		document_id => "%{id}"
		hosts => "localhost:9200"
	}
}

Any ideas why I am getting nil?

When I run the query in RazorSQL using the same JDBC driver it works OK, but in Logstash I get the output:

You posted your configuration twice.

And that helps how? What do you need to investigate? I was kind of hoping for some form of solution here rather than a comment on the number of postings. If more information would help, mentioning what you need would go a long way towards resolving the issue.

Your post is hard to read, can you format the code with the code button - </>.

And that helps how?

When you said "... in Logstash I get the output" it sounded as if you intended to post logs or similar, but what followed was just a repetition of your configuration. That looked like a mistake.

Please post what you get from Logstash (use a stdout { codec => rubydebug } output) and what you expected to receive.
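For example, a debugging output section could look like this (a sketch only, keeping your existing file path):

```
output
{
	# Dump every event to the console so the fields and values can be inspected
	stdout { codec => rubydebug }

	file
	{
		path => "c:\elk\temp\test.out"
	}
}
```

The rubydebug codec prints each event as a readable hash, which makes it easy to see whether DataChanged is arriving as nil or as a string.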

Thank you - much more helpful I understand now.

Basically I get very little back. The varchar columns come back as null. I have tried a variety of things but cannot get data back from the varchar fields. We are using Logstash fine for importing IIS logs, but not for this. The purpose is to read in audit data and store it off the main database, breaking out the XML and JSON data so that we can query and report in a more efficient manner.

The data is read from a view which, when the same drivers are used in RazorSQL, brings back data. A data example can be found at Data Example, the config file at Config File, and the output at Output File.

Many thanks

Mark

Hmm. Then I'm not sure what's up.

I know, very weird. Initially I thought it was down to the JDBC drivers, but they seem to work fine in RazorSQL. I will play a little more, perhaps with an older version of Logstash, and see if that has the same issues. From extensive googling I can see no limits on text sizes etc.
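One thing I may also try, in case the driver struggles with nvarchar(max) columns, is casting the column inside the statement. This is only a guess on my part, and the fixed length would truncate longer values, so it is purely a diagnostic step:

```
statement => "SELECT Id, CAST(DataChanged AS nvarchar(4000)) AS DataChanged, LEN(DataChanged) AS Length FROM [INRstarAudit].[dbo].[DataAudit] WHERE Id > :sql_last_value"
```

If the cast version brings data back, that would point at how the driver returns the max-length type rather than at Logstash itself.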

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.