Logstash 7.0.1 not working when lines are too long

Hi, I am having problems when I try to read a log file whose lines are each more than 8000 characters long. Does Logstash have a limitation on line length, or do I have to configure something for it to work correctly?

This is my match filter.

match => {"message" => "%{DATA:Fecha}\s%{DATA:Hora}\s%{DATA:Domain}\s%{LOGLEVEL:NivelLog}\s%{DATA:LoggerMessageProcessor}\s-\smessage.id:\s%{DATA:message}\s/\sRESPONSE HOST:\s%{GREEDYDATA:RESPONSE}.*"}

And this is a typical log line:
2019-04-30 10:41:22,982 [[banco_provincia_legacy_domain].HTTP_LISTENER_GENERAL.worker.1340] INFO org.mule.api.processor.LoggerMessageProcessor - message.id: a496f080-6b4d-11e9-b6d6-005056a84e5a / RESPONSE HOST: OK|44|LA PROPUESTA LA TIENE EN ESTE MOMENTO EL TERM
(to reproduce this, append 6000 spaces to the end of these lines)

Thank you.

What error are you seeing? What does your config look like?

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {

	file{
		path => "C:/SuiteElastic/logs/*.log"
		start_position => "beginning"
		sincedb_path => "C:/SuiteElastic/since"
		codec => plain {
                    charset => "ISO-8859-1"
           }
		   

	}
	#beats {
	#		port => 5044
	#	}
}

filter {


	grok {
		#break_on_match => false
		#match => {"message" => "Marca: %{DATA:Marca} - Color: %{DATA:Color} - Modelo: %{GREEDYDATA:Modelo}"}
		match => {"message" => "%{DATA:Fecha}\s%{DATA:Hora}\s%{DATA:Domain}\s%{LOGLEVEL:NivelLog}\s%{DATA:LoggerMessageProcessor}\s-\smessage.id:\s%{DATA:message}\s/\sIP\sProcess:\s/%{URIHOST:IP Process}\s/\smessageIdProcess:\s%{DATA:messageIdProcess}\s/\sREQUEST HOST:\s%{GREEDYDATA:REQUEST}"}	
	}
	if "REQUEST" not in [tags] {
		grok {
			#match => {"message" => "Nombre: %{DATA:Nombre} - Apellido: %{DATA:Apellido} - DNI: %{GREEDYDATA:DNI}"}
			match => {"message" => "%{DATA:Fecha}\s%{DATA:Hora}\s%{DATA:Domain}\s%{LOGLEVEL:NivelLog}\s%{DATA:LoggerMessageProcessor}\s-\smessage.id:\s%{DATA:message}\s/\sRESPONSE HOST:\s%{GREEDYDATA:RESPONSE}.*"}
			remove_tag => ["_grokparsefailure"]
		}
	}

}



output {
  stdout {}
  file {
    path => "C:/SuiteElastic/test.log"
  }
  #elasticsearch {
    #hosts => ["http://127.0.0.1:9200/"]
    #index => "indice_1"
    #user => "elastic"
    #password => "changeme"
  #}
}

This is the config file. The problem is that with the long lines it doesn't log all of them, but if I remove the spaces at the end of each line it works fine (I tried this with the same file), which is very strange.

Do you have a line feed at the end of every event? How many events are missing?
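One thing worth checking: both of your patterns lean heavily on `DATA`/`GREEDYDATA`, and the second one ends in `%{GREEDYDATA:RESPONSE}.*`. With thousands of trailing spaces the regex engine has to backtrack over all of them, which can push grok past its timeout (`timeout_millis`, 30 seconds by default) and drop the match with a `_groktimeout` tag. As a sketch of a workaround, assuming the trailing spaces carry no data you need, you could strip them with mutate before grok and drop the redundant trailing `.*`:

```
filter {
  mutate {
    # Remove leading/trailing whitespace from the event so grok
    # does not have to backtrack over thousands of spaces.
    strip => ["message"]
  }
  grok {
    # GREEDYDATA already consumes to the end of the line, so the
    # trailing .* in the original pattern only adds backtracking.
    match => { "message" => "%{DATA:Fecha}\s%{DATA:Hora}\s%{DATA:Domain}\s%{LOGLEVEL:NivelLog}\s%{DATA:LoggerMessageProcessor}\s-\smessage.id:\s%{DATA:message}\s/\sRESPONSE HOST:\s%{GREEDYDATA:RESPONSE}" }
  }
}
```

If the events that go missing come back with a `_groktimeout` tag in your `stdout {}` output, that would confirm this is the cause.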

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.