Logstash error while processing ingest node pipeline

I am trying to implement a pipeline in Logstash which takes data from a source and does a small field extraction using a grok filter (it splits the event into field1 and rest). The output segment of my Logstash config has the following format:

output {
  elasticsearch {
    hosts => ""
    index => "<index_name>"
    document_type => "<mapping_type>"
    pipeline => "<pipeline_name>"
    user => "elastic"
    password => "changeme"
    workers => 1
  }
}

Now, there are 10 possible values for field1, and the pipeline name depends on the value of field1 (so I have also created 10 different ingest node pipelines in my Elasticsearch instance).
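Rather than hard-coding a separate output per pipeline, the pipeline option of the elasticsearch output accepts sprintf-style field references, so the ingest pipeline can be chosen per event. A minimal sketch, assuming the grok filter stores the discriminating value in a field named field1 and the ingest pipelines are named after it (both names here are hypothetical):

```
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "my_index"
    # Resolve the ingest pipeline name per event from the extracted field,
    # e.g. an event with field1 == "a" goes through pipeline "pipeline_a".
    pipeline => "pipeline_%{[field1]}"
    user => "elastic"
    password => "changeme"
  }
}
```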

When I try to implement this, I get the following error in Windows PowerShell:

[2017-10-27T15:02:52,538][INFO ][logstash.outputs.elasticsearch] Retrying individual bulk actions that failed or were rejected by the previous bulk request. {:count=>1}
[2017-10-27T15:02:54,548][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 500 ({"type"=>"exception", "reason"=>"java.lang.IllegalArgumentException: java.lang.IllegalArgumentException: field[rest] size [40] doesn't match header size [6].", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"java.lang.IllegalArgumentException: field[rest] size [40] doesn't match header size [6].", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"field[rest] size [40] doesn't match header size [6]."}}, "header"=>{"processor_type"=>"csv"}})

I am not able to decipher this error. Any help would be appreciated.
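For context: the log shows the failing processor is a csv processor in the ingest pipeline ("header"=>{"processor_type"=>"csv"}), and the message says the rest field parsed into 40 values while the processor declares only 6 target columns. That suggests the column list in the pipeline definition does not match what the data actually contains (or the separator/quoting is off, so one row splits into more values than expected). A hedged sketch of what a matching definition might look like; the option names follow the csv processor shipped with recent Elasticsearch releases and may differ for the csv processor/plugin actually installed, and the pipeline and column names are hypothetical:

```
PUT _ingest/pipeline/my_pipeline
{
  "processors": [
    {
      "csv": {
        "field": "rest",
        "separator": ",",
        "target_fields": ["col1", "col2", "col3", "col4", "col5", "col6"]
      }
    }
  ]
}
```

The number of entries in the target column list must equal the number of values the row actually splits into, so either the list needs to cover all the columns in the data or the rest field needs to be trimmed down before the csv step.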


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.