Costi
(Diaconita Costinel)
March 5, 2019, 1:18pm
Hi!
I am using the jdbc input plugin to run a query against an SQL Server database and then import the result of the query into Elasticsearch.
In SQL, when I select everything I get 200,000 rows, but after exporting them into Elasticsearch I get over 1 million events.
Does anyone have an idea why this is happening? Am I getting duplicate events somehow?
Thanks!
Badger
March 5, 2019, 1:51pm
What does the input configuration look like? Do you have an ORDER BY clause?
Costi
(Diaconita Costinel)
March 5, 2019, 2:05pm
I have a SELECT * query, and in the filter section I just extract a substring and add it to a different field using a grok filter.
grok {
  match => {
    "logevent" => 'MessageType":"?(?<messagetype>[a-zA-Z]+)'
  }
}
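As an aside, that grok pattern can be checked outside Logstash. A minimal Python translation (the sample `logevent` string below is a made-up example, and Python's `(?P<name>...)` named-group syntax replaces grok's Oniguruma `(?<name>...)`):

```python
import re

# Python version of the grok pattern: match the literal text MessageType":
# followed by an optional quote, then capture a run of letters.
pattern = re.compile(r'MessageType":"?(?P<messagetype>[a-zA-Z]+)')

# Hypothetical log line, just for illustration.
logevent = '{"Level":"Info","MessageType":"OrderCreated","Payload":{}}'

match = pattern.search(logevent)
if match:
    print(match.group("messagetype"))  # -> OrderCreated
```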
Badger
March 5, 2019, 2:10pm
That did not answer either of my questions...
Costi
(Diaconita Costinel)
March 5, 2019, 2:23pm
input {
  jdbc {
    charset => "UTF-8"
    jdbc_driver_library => "D:\sqljdbc_7.0\enu\mssql-jdbc-7.0.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://*****************************"
    jdbc_user => "********"
    jdbc_password => "**************!"
    statement => "SELECT * FROM [dbo].[__Logs]"
    use_column_value => true
    tracking_column => "id"
  }
}
filter {
  if "MessageType" in [logevent] {
    grok {
      match => {
        "logevent" => 'MessageType":"?(?<messagetype>[a-zA-Z]+)'
      }
    }
  }
}
output {
  stdout {
    codec => json_lines { charset => "UTF-8" }
  }
  elasticsearch {
    hosts => "https://*****************************"
    user => "elastic"
    password => "**************"
    index => "count_of_events-%{+YYYY.MM.dd}"
    document_type => "logs"
  }
}
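For context: this configuration sets a tracking column but never references it in the statement, and the elasticsearch output lets Elasticsearch auto-generate a new `_id` for every event, so any re-read of a row becomes a fresh document. A hedged sketch of the usual fix (the `WHERE` clause, `tracking_column_type`, and `document_id` settings are suggestions, not taken from this thread):

```
input {
  jdbc {
    # Only fetch rows newer than the last value seen, in a stable order.
    statement => "SELECT * FROM [dbo].[__Logs] WHERE id > :sql_last_value ORDER BY id"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
  }
}
output {
  elasticsearch {
    # Reuse the primary key as the document id so re-reads overwrite
    # the same document instead of creating duplicates.
    document_id => "%{id}"
  }
}
```

With `document_id` set, even a full re-import updates existing documents in place rather than inflating the event count.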
Badger
March 5, 2019, 2:28pm
Does it work any better if you use this?...
statement => "SELECT * FROM [dbo].[__Logs] ORDER BY id"
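For background (this reasoning is not spelled out in the thread): the ORDER BY suggestion matters most when `jdbc_paging_enabled` is in use, because the jdbc input then splits the statement into multiple paged queries, and without an ORDER BY the database gives no stable row order, so the same row can show up on more than one page and be emitted more than once. A sketch, assuming paging is enabled:

```
jdbc {
  jdbc_paging_enabled => true
  jdbc_page_size => 50000
  # A deterministic ORDER BY keeps each row on exactly one page.
  statement => "SELECT * FROM [dbo].[__Logs] ORDER BY id"
}
```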
Costi
(Diaconita Costinel)
March 6, 2019, 7:38am
I tried that, but I still get too many events.
system
(system)
Closed April 3, 2019, 7:38am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.