Only String and Array types are splittable. field:splitedJson is of type = NilClass

Hello,

I have an array of JSON objects and I can parse them out nicely with Logstash. However, I realized that the logs are flooded with

[2019-07-23T10:35:06,859][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:splitedJson is of type = NilClass

I am concerned about this and would like to know how I can fix it.

My Logstash config:

input {
  file {
    path => ["/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP"]
    start_position => "beginning"
    type => "json"
    codec => "json"
    sincedb_path => "/dev/null"
  }
}
filter {
  json { source => "message" target => "splitedJson" }
  split { field => "splitedJson" }
}
output {
  elasticsearch {
    hosts => ["http://esnode1:9200", "http://esnode2:9200", "http://esnode3:9200"]
    index => "sysauto-host-ldap-%{+YYYY.MM.dd}"
  }
}

My sample data:

[
{"DC_ID":"3","DC_NAME":"BT","NTP1_ID":"4","NTP1_IP":"10.1.4.132","NTP1_NAME":"CCC-CCORE-DS01-2","NTP1_STATUS":"active","NTP2_ID":"8","NTP2_IP":"10.1.253.254","NTP2_NAME":"CCC-DSU-DS02-2","NTP2_STATUS":"active","SUBNET_ID":"160","SUBNET_IP":"10.1.101.0","SUBNET_MASK":"24"}
,
{"DC_ID":"3","DC_NAME":"BT","NTP1_ID":"3","NTP1_IP":"10.1.4.131","NTP1_NAME":"CCC-CCORE-DS01-3","NTP1_STATUS":"active","NTP2_ID":"5","NTP2_IP":"10.1.4.140","NTP2_NAME":"CCC-DSU-DS02-3","NTP2_STATUS":"active","SUBNET_ID":"79","SUBNET_IP":"10.1.2.0","SUBNET_MASK":"24"}
]

This is the data output:

{
"NTP1_IP" => "10.1.4.132",
"DC_NAME" => "BT",
"NTP2_IP" => "10.1.253.254",
"NTP1_STATUS" => "active",
"host" => "elk-dev-logstash",
"NTP1_NAME" => "CCC-CCORE-DS01-2",
"NTP1_ID" => "4",
"NTP2_STATUS" => "active",
"SUBNET_ID" => "160",
"NTP2_ID" => "8",
"DC_ID" => "3",
"@timestamp" => 2019-07-23T03:08:39.667Z,
"path" => "/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP",
"tags" => [
[0] "_split_type_failure"
],
"NTP2_NAME" => CCC-DSU-DS02-2",
"type" => "json",
"SUBNET_IP" => "10.1.101.0",
"SUBNET_MASK" => "24",
"@version" => "1"
}
{
"NTP1_IP" => "10.1.4.131",
"DC_NAME" => "BT",
"NTP2_IP" => "10.1.4.140",
"NTP1_STATUS" => "active",
"host" => "elk-dev-logstash",
"NTP1_NAME" => "CCC-CCORE-DS01-3",
"NTP1_ID" => "3",
"NTP2_STATUS" => "active",
"SUBNET_ID" => "79",
"NTP2_ID" => "5",
"DC_ID" => "3",
"@timestamp" => 2019-07-23T03:08:39.667Z,
"path" => "/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP",
"tags" => [
[0] "_split_type_failure"
],
"NTP2_NAME" => "CCC-DSU-DS02-3",
"type" => "json",
"SUBNET_IP" => "10.1.2.0",
"SUBNET_MASK" => "24",
"@version" => "1"
}

The splitedJson field does not exist in my data; I am only using it as a working variable.
Is it wrong to use it this way?

I have also tried mutate { convert => { "splitedJson" => "string" } } to convert the data type of splitedJson, but I still receive the same warning.

You are using a json codec on the input, so there is no message field; the json filter is therefore a no-op, and the splitedJson field never gets created.

If the entire array is ingested as a single event, the json codec does the split for you. If you are ingesting the lines one at a time, you will get errors for the lines containing the [ and the ]. But you can fix that with

if [message] drop {}

That will drop anything that the json codec could not parse (when parsing fails, the codec keeps the raw line in the message field, which is what the conditional checks for).
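
Side note, not part of the original reply: a quick way to confirm that the codec has already done the parsing is a temporary debug output; successfully parsed events should show their fields at the top level and have no message field. A minimal sketch:

output {
  stdout { codec => rubydebug }   # prints each event in the same format as the output pasted above
}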


Your solution works! However, the line you quoted gave me an error.

Writing it this way fixed it 🙂

 filter {
  if [message] {
    drop { }
  }
}

Thank you Badger!
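
For reference, the resulting pipeline looks something like this. This is a minimal sketch that keeps the paths and hosts from the question and drops the json and split filters, on the assumption that the codec already parses each line as Badger explained:

input {
  file {
    path => ["/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP"]
    start_position => "beginning"
    type => "json"
    codec => "json"
    sincedb_path => "/dev/null"
  }
}
filter {
  # lines the codec could not parse (the lone [ and ]) keep their raw text in the message field
  if [message] {
    drop { }
  }
}
output {
  elasticsearch {
    hosts => ["http://esnode1:9200", "http://esnode2:9200", "http://esnode3:9200"]
    index => "sysauto-host-ldap-%{+YYYY.MM.dd}"
  }
}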
