Hello,
I have an array of JSON objects and I can parse them out nicely with Logstash. However, I noticed that the logs are flooded with
[2019-07-23T10:35:06,859][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:splitedJson is of type = NilClass
I am concerned about this and would like to know how I can fix it.
My Logstash config:
input {
  file {
    path => ["/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP"]
    start_position => "beginning"
    type => "json"
    codec => "json"
    sincedb_path => "/dev/null"
  }
}

filter {
  json { source => "message" target => "splitedJson" }
  split { field => "splitedJson" }
}

output {
  elasticsearch {
    hosts => ["http://esnode1:9200", "http://esnode2:9200", "http://esnode3:9200"]
    index => "sysauto-host-ldap-%{+YYYY.MM.dd}"
  }
}
My sample data:
[
{"DC_ID":"3","DC_NAME":"BT","NTP1_ID":"4","NTP1_IP":"10.1.4.132","NTP1_NAME":"CCC-CCORE-DS01-2","NTP1_STATUS":"active","NTP2_ID":"8","NTP2_IP":"10.1.253.254","NTP2_NAME":"CCC-DSU-DS02-2","NTP2_STATUS":"active","SUBNET_ID":"160","SUBNET_IP":"10.1.101.0","SUBNET_MASK":"24"}
,
{"DC_ID":"3","DC_NAME":"BT","NTP1_ID":"3","NTP1_IP":"10.1.4.131","NTP1_NAME":"CCC-CCORE-DS01-3","NTP1_STATUS":"active","NTP2_ID":"5","NTP2_IP":"10.1.4.140","NTP2_NAME":"CCC-DSU-DS02-3","NTP2_STATUS":"active","SUBNET_ID":"79","SUBNET_IP":"10.1.2.0","SUBNET_MASK":"24"}
]
This is the output:
{
"NTP1_IP" => "10.1.4.132",
"DC_NAME" => "BT",
"NTP2_IP" => "10.1.253.254",
"NTP1_STATUS" => "active",
"host" => "elk-dev-logstash",
"NTP1_NAME" => "CCC-CCORE-DS01-2",
"NTP1_ID" => "4",
"NTP2_STATUS" => "active",
"SUBNET_ID" => "160",
"NTP2_ID" => "8",
"DC_ID" => "3",
"@timestamp" => 2019-07-23T03:08:39.667Z,
"path" => "/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP",
"tags" => [
[0] "_split_type_failure"
],
"NTP2_NAME" => CCC-DSU-DS02-2",
"type" => "json",
"SUBNET_IP" => "10.1.101.0",
"SUBNET_MASK" => "24",
"@version" => "1"
}
{
"NTP1_IP" => "10.1.4.131",
"DC_NAME" => "BT",
"NTP2_IP" => "10.1.4.140",
"NTP1_STATUS" => "active",
"host" => "elk-dev-logstash",
"NTP1_NAME" => "CCC-CCORE-DS01-3",
"NTP1_ID" => "3",
"NTP2_STATUS" => "active",
"SUBNET_ID" => "79",
"NTP2_ID" => "5",
"DC_ID" => "3",
"@timestamp" => 2019-07-23T03:08:39.667Z,
"path" => "/etc/logstash/conf.d/sysauto/periodic-curl-salt-data/salt-data/Host-LDAP",
"tags" => [
[0] "_split_type_failure"
],
"NTP2_NAME" => "CCC-DSU-DS02-3",
"type" => "json",
"SUBNET_IP" => "10.1.2.0",
"SUBNET_MASK" => "24",
"@version" => "1"
}
The splitedJson field does not exist in my data; I only use it as a target field to hold the parsed JSON. Is it wrong to use it this way?
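I was also wondering whether I should simply guard the split with a conditional, something like the sketch below, but I am not sure if that is a real fix or just hides the warning:

filter {
  json { source => "message" target => "splitedJson" }
  # only run split when the json filter actually produced the field
  if [splitedJson] {
    split { field => "splitedJson" }
  }
}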
Separately, I have tried mutate { convert => { "splitedJson" => "string" } } to convert the data type of splitedJson, but I still receive the same warning.
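For reference, the filter with the mutate looked roughly like this (I placed the mutate just before the split):

filter {
  json { source => "message" target => "splitedJson" }
  # attempt to force the field to a string before splitting
  mutate { convert => { "splitedJson" => "string" } }
  split { field => "splitedJson" }
}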