Error while indexing data using elasticsearch output plugin

I have a use case with a large XML input file (currently testing with a 36 MB file, but it can be larger). I am using a Logstash pipeline to convert the XML to JSON and then index the JSON into Elasticsearch. I am getting the exception below in the output stage:

    ERROR logstash.outputs.elasticsearch - Got a bad response code from server, but this code is not considered retryable. Request will be dropped {:code=>400, :response_body=>"{\"error\":{\"root_cause\":[{\"type\":\"parse_exception\",\"reason\":\"Failed to derive xcontent\"}],\"type\":\"parse_exception\",\"reason\":\"Failed to derive xcontent\"},\"status\":400}"}

The Logstash configuration is below:

    input {
      # beats {
      #   port => 5044
      # }

      file {
        path => "D:\sdn_advanced\sdn_advanced.xml"
        start_position => "beginning"
        type => "sdn"
        codec => multiline {
          pattern => "^<\?xml"
          negate => true
          what => "previous"
          max_lines => 1000000
          max_bytes => "180 mb"
          auto_flush_interval => 15
        }
      }
    }

    filter {
      # ruby {
      #   code => "event.set('correctmessage', event.get('message') + '</Sanctions>' + ' ' + 10.chr + ' ')"
      # }

      xml {
        source => "message"
        target => "Sanctions"
        store_xml => true
        force_array => false
      }

      mutate {
        remove_field => ["path", "@version", "host", "message", "type", "tags", "correctmessage"]
      }
    }

    output {
      stdout {
        codec => json
      }

      elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "sdn"
        retry_on_conflict => 3
      }

      # file {
      #   path => "D:\Softwares\logstash-5.4.0\logstash-5.4.0\sdn_out.txt"
      # }
    }
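
For reference, the "Failed to derive xcontent" error generally means Elasticsearch could not recognize the request body as JSON (for example, an empty or truncated body) rather than that the document was too large. One way to take Logstash out of the picture is to index a single document directly over the REST API. Below is a minimal sketch, assuming Elasticsearch 5.x on 127.0.0.1:9200 and the sdn index from the config above; the document payload is a made-up placeholder, not the real converted record:

    import json
    import urllib.request
    from urllib.error import HTTPError

    # Placeholder standing in for one converted <Sanctions> record.
    doc = {"Sanctions": {"example": "placeholder"}}

    req = urllib.request.Request(
        "http://127.0.0.1:9200/sdn/sdn",               # index/type matching the output config
        data=json.dumps(doc).encode("utf-8"),          # body must be non-empty, valid JSON
        headers={"Content-Type": "application/json"},
        method="POST",
    )

    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode("utf-8"))
    except HTTPError as err:
        # A 400 here with the same parse_exception would point at the request body.
        print(err.code, err.read().decode("utf-8"))

If a direct request like this succeeds, the problem is likely in what the elasticsearch output is sending, not in Elasticsearch itself.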

Is there a maximum limit on the size of a document that can be indexed into Elasticsearch?
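
For context, Elasticsearch caps the size of a single HTTP request body with the `http.max_content_length` setting, which defaults to 100mb; oversized requests are rejected with a 413, not a 400 like the one above, so a 36 MB document should be within bounds. A quick sketch for reading the effective value, assuming a cluster on 127.0.0.1:9200 whose version supports the `include_defaults` flag on the cluster settings API (an assumption; on versions without it, the limit is simply 100mb unless overridden in elasticsearch.yml):

    import json
    import urllib.request

    # Query the cluster settings API, including built-in defaults, and filter
    # the response down to the HTTP request-body limit.
    url = ("http://127.0.0.1:9200/_cluster/settings"
           "?include_defaults=true"
           "&filter_path=defaults.http.max_content_length")

    with urllib.request.urlopen(url) as resp:
        print(json.dumps(json.loads(resp.read().decode("utf-8")), indent=2))

On a stock cluster this should print something like {"defaults": {"http": {"max_content_length": "100mb"}}}.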
