Logstash keeps crashing using the JDBC Plugin

Hi All,

I have created the following index:

PUT newsindex
{
  "settings" : {
    "number_of_shards":3,
    "number_of_replicas":2
  },
  "mappings" : {
    "news": {
      "properties": {
        "data": {
          "type": "text"
        }
      }
    }
  }
}
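(For reference, after creating the index you can double-check that the mapping was applied as intended with a quick request in Kibana Dev Tools — a minimal sketch:)

```
GET newsindex/_mapping
```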

In my logstash.conf file I have declared:

input {
  jdbc {
    jdbc_driver_library => "C:\Program Files\Microsoft JDBC DRIVER 6.2 for SQL Server\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://TST-DBS-20;user=Elasticsearch;password=elastic123;"
    jdbc_user => "Elasticsearch"
    statement => "SELECT NewsID, HeadLine, BodyText, DateSubmitted, Approved, UserName, Type, Caption, Author, Contact, StaffID, SocialClubRegionID, DateCreated, CreatedBy, LastModifiedDate, ModifiedBy
FROM [News].[dbo].[News] order by NewsID"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["tst-sch-20:9200"]
    index => "newsindex"
    document_id => "%{NewsID}"
    user => "elastic"
    password => "elastic123"
  }
  stdout { codec => json }
}

After running the above I only have 1 record in my index. The Elasticsearch log contains a lot of entries, but one thing that stands out is:

java.lang.IllegalArgumentException: name cannot be empty string
at org.elasticsearch.index.mapper.ObjectMapper.<init>(ObjectMapper.java:326)

All the database table columns are nullable except for NewsID, the primary key, so some of the columns contain NULLs. Interestingly, I don't have any column specifically called 'name'. The Logstash logs give more detail and suggest that my Type column can't be null. How do I handle NULLs in the conversion process?

In the Logstash logs I have:

[2017-11-02T08:37:42,489][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"%{NewsID}", :_index=>"newsindex", :_type=>"", :_routing=>nil}, 2017-11-02T08:37:40.223Z %{host} %{message}], :response=>{"index"=>{"_index"=>"newsindex", "_type"=>"", "_id"=>"%{NewsID}", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"name cannot be empty string"}}}}

I have changed the SELECT statement, wrapping each column in IsNull(xxxxx, '') syntax, and still receive the same error. I guess that's because NULL is not an empty string — but Elasticsearch should be able to handle an empty string, and so should Logstash.

Interestingly, when I use Kibana and run a match_all query against the index, the one record that did make it into the index shows fields like caption = "" for the columns that could be empty. So it looks as though empty strings can be saved. What is going on? I'm totally confused.
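(The match_all query I ran in Kibana looked roughly like this — a sketch of the standard search API call, not my exact request:)

```
GET newsindex/_search
{
  "query": { "match_all": {} }
}
```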

I needed to add the following to the jdbc input:

use_column_value => true
tracking_column => "NewsID"

Additionally, every column that could contain an empty string needed to be wrapped in the following form in my SQL statement:

NULLIF(UserName,'') as UserName
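Putting both changes together, the jdbc input ends up looking roughly like this (a sketch based on my config above — driver path, connection string, and the full column list are as in my original post, with only a couple of columns shown wrapped in NULLIF for illustration):

```
input {
  jdbc {
    jdbc_driver_library => "C:\Program Files\Microsoft JDBC DRIVER 6.2 for SQL Server\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://TST-DBS-20;user=Elasticsearch;password=elastic123;"
    jdbc_user => "Elasticsearch"
    use_column_value => true
    tracking_column => "NewsID"
    statement => "SELECT NewsID, NULLIF(UserName,'') AS UserName, NULLIF(Caption,'') AS Caption, ... FROM [News].[dbo].[News] ORDER BY NewsID"
  }
}
```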
