Logstash JSON iteration failure

I have the following Logstash filter configuration:

filter {
  json {
    source => "message"
    target => "dat"
  }
  split {
    field => "[dat]"
  }
  split {
    field => "[dat][friends]"
  }
}

The JSON message is:
[
  {
    "_id": "5aaf44772d97f524d9d11519",
    "index": 0,
    "guid": "ba92bd1b-5f5c-448f-9cc3-0462e51d5d32",
    "isActive": false,
    "tags": [
      "amet",
      "aliquip",
      "veniam",
      "officia",
      "incididunt",
      "do",
      "tempor"
    ],
    "friends": [
      {
        "id": 0,
        "name": "Tami Rodriguez"
      },
      {
        "id": 1,
        "name": "Amalia Whitehead"
      },
      {
        "id": 2,
        "name": "Welch Molina"
      }
    ]
  }
]
Similar values are present in the other entries; I have shown a single JSON document here. I get the following warning:
[2018-03-19T12:16:05,609][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[dat][friends] is of type = NilClass

That indicates that there is no [dat][friends] field in the event. Comment out the split filters and inspect the resulting event with a stdout { codec => rubydebug } output. What does it look like?
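
Something like this (a sketch of the debugging config, with the split filters disabled) should do it:

filter {
  json {
    source => "message"
    target => "dat"
  }
  # split { field => "[dat]" }
  # split { field => "[dat][friends]" }
}

output {
  stdout { codec => rubydebug }
}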

{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.471Z,
"message" => " {",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.472Z,
"message" => " "_id": "5aaf44772d97f524d9d11519",",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.474Z,
"message" => " "index": 0,",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.475Z,
"message" => " "guid": "ba92bd1b-5f5c-448f-9cc3-0462e51d5d32",",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.475Z,
"message" => " "isActive": false,",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.475Z,
"message" => " "friends": [",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " {",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " "id": 0,",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " "name": "Tami Rodriguez"",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " },",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " {",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.476Z,
"message" => " "id": 1,",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.477Z,
"message" => " "name": "Amalia Whitehead"",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.478Z,
"message" => " },",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.479Z,
"message" => " {",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.480Z,
"message" => " "id": 2,",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.484Z,
"message" => " "name": "Welch Molina"",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.484Z,
"message" => " }",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.485Z,
"message" => " ]",
"tags" => [
[0] "_jsonparsefailure"
]
}
{
"@version" => "1",
"@timestamp" => 2018-03-19T07:51:59.485Z,
"message" => " }",
"tags" => [
[0] "_jsonparsefailure"
]
}

I have checked the JSON and it is valid, but Logstash reports a JSON parse failure.
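
The rubydebug output above shows the cause: each physical line of the pretty-printed file arrives as a separate event, so the json filter never sees a complete document. Assuming the input is a file read line by line, one way to reassemble the document is a multiline codec. A sketch (the path and pattern are assumptions; adjust to your setup):

input {
  file {
    path => "/path/to/data.json"   # hypothetical path
    start_position => "beginning"
    sincedb_path => "/dev/null"    # reread on every run; for testing only
    codec => multiline {
      pattern => "^\["             # a new document starts at the opening bracket
      negate => true
      what => "previous"           # append every other line to the previous event
      auto_flush_interval => 2     # emit the final document after 2 s of inactivity
    }
  }
}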

The first error was due to whitespace and has been rectified. Now I can see properly formatted JSON in the output, but I am getting a 400 error saying:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"jsonit", :_type=>"event", :_routing=>nil}, #<LogStash::Event:0x60dbc5ff>], :response=>{"index"=>{"_index"=>"jsonit", "_type"=>"event", "_id"=>"yzRdPWIBNf-7tezaKSsV", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [dat]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:2009"}}}}}

It looks like the dat field has already been mapped as a string but now you're trying to store an object in that field. If you don't have anything important in the index you can just delete it and start over.
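
If deleting it is acceptable, that is a single call (a sketch; assumes Elasticsearch on localhost:9200, index name taken from the error above):

curl -XDELETE 'http://localhost:9200/jsonit'

On the next pipeline run Elasticsearch will derive a fresh mapping from the first document it indexes.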

OK, will try that and get back to you. Thanks!

It's working! But what if the schema in the index changes? How do I update a field's type once it has been stored/mapped?

The mapping of a field can't be changed without reindexing the data into a new index.
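
A sketch with the reindex API (jsonit-v2 is a hypothetical target; assumes Elasticsearch on localhost:9200). First create the new index with the desired mapping, then copy the documents across:

curl -XPOST 'http://localhost:9200/_reindex' -H 'Content-Type: application/json' -d '
{
  "source": { "index": "jsonit" },
  "dest": { "index": "jsonit-v2" }
}'

Afterwards, point your queries (or an index alias) at the new index.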

Thanks

Hi Magnus, now the inner fields are not searchable like they were earlier. I have two fields: dat as a string and dat.keyword as a string.

I don't know what you mean. What does an example document look like? What kind of search do you want to do? What are the results?
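
It may also help to look at how the fields were actually mapped (assumes Elasticsearch on localhost:9200):

curl -XGET 'http://localhost:9200/jsonit/_mapping?pretty'

That shows whether dat was indexed as an object with searchable inner fields or flattened into a plain text field with a keyword sub-field.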
