Logstash shuts down after running for a few seconds

Hi,

Every time I try to run Logstash with my config file, it shuts down a few seconds after the service is started.

This is my config file:

input {
  file {
    path => "E:\DATA\test\*log"
    start_position => "beginning"
  }
}
# end of input

filter {

  if [path] == "E:\DATA\test\bluecoat.log" {
    grok {
      patterns_dir => [".\Pattern\Pattern.txt"]
      match => { "message" => "%{SPACE:date} %{SPACE:time} %{SPACE:time_taken} %{SPACE:c_ip} %{SPACE:cs_username} %{SPACE:cs_auth_group} %{SPACE:x_exception_id} %{SPACE:sc_filter_result} %{QUOTE:cs_categories} %{SPACE:cs_referer}  %{SPACE:sc_status} %{SPACE:s_action} %{SPACE:cs_method} %{SPACE:rs_Content_Type} %{SPACE:cs_uri_scheme} %{SPACE:cs_host} %{SPACE:cs_uri_port} %{SPACE:cs_uri_path} %{SPACE:cs-uri-query} %{SPACE:cs_uri_extension} %{SPACE:cs_User_Agent} %{SPACE:s_ip} %{SPACE:sc_bytes} %{SPACE:cs_bytes} %{SPACE:x_virus_id} %{QUOTE:x_bluecoat_application_name} %{QUOTE:x_bluecoat_application_operation} %{SPACE:cs_auth_type} %{SPACE:x_auth_credential_type} %{SPACE:r_ip}" }
    }
    # end of grok

    geoip {
      source => "s_ip"
    }
  }
  # end of if BC

  else if [path] == "E:\DATA\test\q" {
    grok {
      patterns_dir => [".\Pattern\Pattern.txt"]
      match => { "message" => "%{COLUMN:email}:%{SPACE:pass}" }
    }
  }
  # end of if data

  if "_grokparsefailure" in [tags] {
    drop { }
  }
}
# end of filter

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

The custom patterns are as follows:

SPACE ([A-Za-z0-9]\S*|-\S*|/\S*)
QUOTE "[^"]*"
COLUMN ^[^:]*\s*

I've tried running with --config.test_and_exit against the config file, and it says it is OK.

Could someone help me please?

Thanks.

Use forward slash (or \\) in the path option of a file input.

What do you see in the logstash log when it shuts down?
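For example, a minimal sketch of the input from the original post with the slashes flipped (the sincedb_path setting is an optional testing aid, not something from this thread):

```
input {
  file {
    # Forward slashes are the safest way to write Windows paths here.
    path => "E:/DATA/test/*.log"
    start_position => "beginning"
    # Optional while testing: NUL is the Windows equivalent of /dev/null,
    # so Logstash forgets its read position and re-reads the files
    # from the beginning on every run.
    sincedb_path => "NUL"
  }
}
```

Remember that start_position => "beginning" only applies to files Logstash has not seen before; without resetting the sincedb, already-read files are not re-read.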

I've tried the config file using (\\, \ and /), but none of those worked. Since I'm on Windows, I guess it should be either (\\) or (\).

The output of stdout doesn't give me much information; right after starting, the shutdown log appears as follows:

[2019-09-02T01:33:08,770][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-09-02T01:33:14,374][INFO ][logstash.runner ] Logstash shut down.

This is what the stdout shows:

C:\Elastic\logstash-7.3.1>.\bin\logstash -f .\log_parser\log_parse.conf
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/Elastic/logstash-7.3.1/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/Elastic/logstash-7.3.1/logs which is now configured via log4j2.properties
[2019-09-02T01:32:55,053][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-09-02T01:32:55,303][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.1"}
[2019-09-02T01:33:05,156][INFO ][org.reflections.Reflections] Reflections took 321 ms to scan 1 urls, producing 19 keys and 39 values
[2019-09-02T01:33:07,527][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-09-02T01:33:07,807][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-09-02T01:33:07,860][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-09-02T01:33:07,864][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-09-02T01:33:07,903][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-09-02T01:33:07,974][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.1-java/vendor/GeoLite2-City.mmdb"}
[2019-09-02T01:33:08,264][ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{COLUMN:email} not defined>, :backtrace=>["C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1425:in `loop'", "C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:274:in `block in register'", "org/jruby/RubyArray.java:1792:in `each'", "C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:268:in `block in register'", "org/jruby/RubyHash.java:1419:in `each'", "C:/Elastic/logstash-7.3.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:263:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:192:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:191:in `register_plugins'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:463:in `maybe_setup_out_plugins'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:204:in `start_workers'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:146:in `run'", "C:/Elastic/logstash-7.3.1/logstash-core/lib/logstash/java_pipeline.rb:105:in `block in start'"], :thread=>"#<Thread:0x7ba0a10b run>"}
[2019-09-02T01:33:07,966][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-09-02T01:33:08,347][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-09-02T01:33:08,359][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2019-09-02T01:33:08,770][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-09-02T01:33:14,374][INFO ][logstash.runner ] Logstash shut down.

It is not finding the definition of COLUMN. Are you using a patterns file, or something like

 pattern_definitions => {
   "SPACE" => "([A-Za-z0-9]\S*|-\S*|/\S*)"
   "QUOTE" => '"[^"]*"'
   "COLUMN" => "^[^:]*\s*"
 }
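For what it's worth, one plausible reason the patterns file was never loaded: the patterns_dir option takes directories, not file names; grok then loads every file in those directories whose name matches patterns_files_glob (default "*"). Pointing it at ".\Pattern\Pattern.txt" names a file, and a relative backslash path on top of that. A sketch, assuming Pattern.txt lives in a Pattern directory next to your config (the actual location is not stated in this thread):

```
grok {
  # patterns_dir expects directories, not file paths. An absolute,
  # forward-slash path avoids both the relative-path and the backslash issues.
  # (The directory below is hypothetical; point it wherever Pattern.txt lives.)
  patterns_dir => ["C:/Elastic/logstash-7.3.1/log_parser/Pattern"]
  match => { "message" => "%{COLUMN:email}:%{SPACE:pass}" }
}
```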

I was indeed using a file, but I tried with pattern_definitions, and Logstash no longer shuts down after starting.

But it looks like it doesn't read the logs from the files it should be reading.

This is my current config file:

input {
  file {
    path => "E:\\DATA\test\*.log"
    start_position => "beginning"
  }
}

filter {

  if [path] == "E:\\DATA\\test\\bluecoat.log" {
    grok {
      match => { "message" => "%{SPACE:date} %{SPACE:time} %{SPACE:time_taken} %{SPACE:c_ip} %{SPACE:cs_username} %{SPACE:cs_auth_group} %{SPACE:x_exception_id} %{SPACE:sc_filter_result} %{QUOTE:cs_categories} %{SPACE:cs_referer}  %{SPACE:sc_status} %{SPACE:s_action} %{SPACE:cs_method} %{SPACE:rs_Content_Type} %{SPACE:cs_uri_scheme} %{SPACE:cs_host} %{SPACE:cs_uri_port} %{SPACE:cs_uri_path} %{SPACE:cs-uri-query} %{SPACE:cs_uri_extension} %{QUOTE:cs_User_Agent} %{SPACE:s_ip} %{SPACE:sc_bytes} %{SPACE:cs_bytes} %{SPACE:x_virus_id} %{QUOTE:x_bluecoat_application_name} %{QUOTE:x_bluecoat_application_operation} %{SPACE:cs_auth_type} %{SPACE:x_auth_credential_type} %{SPACE:r_ip}" }

      pattern_definitions => {
        "SPACE" => "([A-Za-z0-9]\S*|-\S*|/\S*)"
        "QUOTE" => '"[^"]*"'
        "COLUMN" => "^[^:]*\s*"
      }
    }

    geoip {
      source => "s_ip"
    }
  }

  else if [path] == "E:\\DATA\\test\\q" {
    grok {
      match => { "message" => "%{COLUMN:email}:%{SPACE:pass}" }

      pattern_definitions => {
        "SPACE" => "([A-Za-z0-9]\S*|-\S*|/\S*)"
        "QUOTE" => '("[^"]*"|-\S*)'
        "COLUMN" => "^[^:]*\s*"
      }
    }
  }

  if "_grokparsefailure" in [tags] {
    drop { }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

I tried using (\) on the file paths, but then it gives me the following error:

Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: File paths must be absolute, relative path specified: E:\DATA\test\bluecoat.log>

Any suggestions?

Thanks.

Use forward slash, not backslash, in the path option of a file input on Windows.
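Note that this also changes what ends up in the [path] field: the file input reports discovered files in the form you configured, so the conditionals in the filter should use forward slashes as well. A sketch (the grok and geoip bodies from the earlier config go where the comments are):

```
input {
  file {
    path => "E:/DATA/test/*.log"
    start_position => "beginning"
  }
}

filter {
  # [path] now contains forward slashes, so compare against the same form.
  if [path] == "E:/DATA/test/bluecoat.log" {
    # bluecoat grok + geoip go here
  } else if [path] == "E:/DATA/test/q" {
    # email:pass grok goes here
  }
}
```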

Yeah that fixed it.

Everything is correct now.

Thanks for the support.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.