Too many events

Hi!

I am using the jdbc input plugin to run a query against an SQL Server database and then import the result of the query into ES.
In SQL, when I run the SELECT I get 200,000 rows, but after I export them into ES I get over 1 million events.
Does anyone have an idea why this is happening? Am I getting duplicate events somehow?

Thanks!

What does the input configuration look like? Do you have an ORDER BY clause?

I have a SELECT * query, and in the filter section I just extract a substring and add it to a different field using a grok filter.

grok {
	match => {
		"logevent" => 'MessageType":"?(?<messagetype>[a-zA-Z]+)'
	}
}

That did not answer either of my questions...

input {
	jdbc {
		charset => "UTF-8"
		jdbc_driver_library => "D:\sqljdbc_7.0\enu\mssql-jdbc-7.0.0.jre8.jar"
		jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		jdbc_connection_string => "jdbc:sqlserver://*****************************"
		jdbc_user => "********"
		jdbc_password => "**************!"
		statement => "SELECT * FROM [dbo].[__Logs]"
		use_column_value => true
		tracking_column => "id"
	}
}
filter {
	if "MessageType" in [logevent] {
		grok {
			match => {
				"logevent" => 'MessageType":"?(?<messagetype>[a-zA-Z]+)'
			}
		}
	}
}
output {
	stdout {
		codec => json_lines {charset => "UTF-8"}
	}
	elasticsearch {
		hosts => "https://*****************************"
		user => "elastic"
		password => "**************"
		index => "count_of_events-%{+YYYY.MM.dd}"
		document_type => "logs"
	}
}

Does it work any better if you use this?...

statement => "SELECT * FROM [dbo].[__Logs] ORDER BY id"

I tried, but I still get too many events.
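
One thing that might explain it (a sketch, assuming the duplicates come from the statement being executed more than once, e.g. on a schedule or after pipeline restarts): the input tracks id, but the statement never references :sql_last_value, so every run re-imports the whole table. Something along these lines would fetch only new rows and make any re-import overwrite rather than duplicate (the names reuse the config above; tracking_column_type => "numeric" is the default and is shown only for clarity):

input {
	jdbc {
		# ... same driver, connection string and credentials as above ...
		# Only fetch rows newer than the last id seen; :sql_last_value is
		# filled in from the tracking column and persisted between runs.
		statement => "SELECT * FROM [dbo].[__Logs] WHERE id > :sql_last_value ORDER BY id"
		use_column_value => true
		tracking_column => "id"
		tracking_column_type => "numeric"
	}
}
output {
	elasticsearch {
		# ... same hosts, credentials and index as above ...
		# Use the row id as the document id so a re-imported row overwrites
		# the existing document instead of creating a duplicate.
		document_id => "%{id}"
	}
}

Note that with the date in the index name, the same row imported on different days still lands in a different index, so document_id alone only deduplicates within one day's index.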

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.