Indexing through Logstash


I'm using the following config file to index the data.

input {
  jdbc {
    jdbc_driver_library => "E:/softwares/sqljdbc-1.2.0.jar"
    # the driver class was left empty; SQL Server's JDBC driver requires it
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://;databaseName=jaggu"
    jdbc_user => "user"
    jdbc_password => "user"
    statement => "SELECT * FROM ism WHERE startDate BETWEEN '2016-05-01' AND '2016-06-01'"
    jdbc_paging_enabled => true
    jdbc_page_size => 100000
  }
}

filter {
  grok {
    patterns_dir => "E:/softwares/ElasticSearch/logstash-1.3.3-flatjar/patterns"
    match => ["startDate", "%{YEAR:al_year}-%{MONTHNUM:al_month}-%{MONTHDAY:al_monthday} %{TIME:al_time}"]
    add_field => ["LogTime", "%{al_year}-%{al_month}-%{al_monthday} %{al_time}"]
  }
  # "YYYY-MM-dd HH:mm:ss.SSS" is a date format, not a grok pattern,
  # so it belongs in a date filter rather than a second grok match
  date {
    match => ["LogTime", "YYYY-MM-dd HH:mm:ss.SSS"]
  }
}

output {
  stdout {
    codec => "json"
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "revenue"
    document_type => "revenue"
  }
}

When I index the data from SQL Server with this config file, one or two records go missing from the index. Say the SQL Server table has 2000 records; after indexing, only 1998 or 1999 of them end up in Elasticsearch. I have tried indexing multiple times, but the result is the same. Is there anything I should change in the config file?
Please suggest.
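One way to see which records are being dropped is to route events whose grok or date parsing fails to stdout instead of Elasticsearch. The `_grokparsefailure` and `_dateparsefailure` tags are Logstash's default failure tags, but the conditional routing below is a sketch, not part of the original pipeline:

```
output {
  if "_grokparsefailure" in [tags] or "_dateparsefailure" in [tags] {
    # print events that failed parsing so they can be inspected
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "revenue"
    }
  }
}
```

With this in place, the difference between the SQL row count and the indexed document count should show up as events printed on the console.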

Sanjay Reddy

Do you get everything in stdout? Is there anything in the logs?

Thanks @warkolm.
I checked the log file; there was a conversion error, and because of it some of the records were not indexed.
Thanks much for the reply :slight_smile:
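If the conversion error comes from a column whose values do not match the Elasticsearch mapping, casting the field explicitly in the pipeline can keep those documents from being rejected. A sketch using the mutate filter's `convert` option (the `revenue` field name here is hypothetical):

```
filter {
  mutate {
    # cast the column explicitly so a stray string value does not
    # trigger a mapping conversion error when Elasticsearch indexes it
    convert => { "revenue" => "float" }
  }
}
```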