Not able to run logstash on RHEL server

Hi,

I have successfully installed Logstash. I have also placed my config file under "/etc/logstash/conf.d" and tried starting Logstash with the commands below:

[root@localhost Softwares]# sudo rpm -i logstash-6.0.0.rpm
warning: logstash-6.0.0.rpm: Header V4 RSA/SHA512 Signature, key ID d88e42b4: NOKEY
Using provided startup.options file: /etc/logstash/startup.options
Successfully created system startup script for Logstash
[root@localhost Softwares]# sudo initctl start logstash
logstash start/running, process 8411

I am trying to push data to Elasticsearch, but I am unable to create an index in Kibana, and I am getting the Logstash logs below.

Below are the logs from /var/log/logstash/logstash-plain.log:

[2018-02-28T10:00:10,617][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-02-28T10:00:10,631][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-02-28T10:00:10,936][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://xxxxxxxxxx.us-east-1.aws.found.io:443"]}
[2018-02-28T10:00:11,080][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x67187da8@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2018-02-28T10:00:11,428][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2018-02-28T10:00:11,466][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}

Please help me in resolving this issue.

And what do you have in the file in the conf.d directory?

Hi @magnusbaeck, I have placed a config file under the conf.d directory which parses my logs.
There is no problem with the parsing itself: with the same config file on Windows, I am able to parse and send data to Elasticsearch successfully. I am following the same procedure on RHEL but am not able to parse/send data to Elasticsearch.

Can you share the Logstash config file that you placed inside /etc/logstash/conf.d?

Hi @Makra, @magnusbaeck, please find my config file below:

input 
{
	file 
	{
		path => "/xxxxx/logs/server.log"
		start_position => "beginning"
		ignore_older => 0
	}
}

filter 
{
	grok 
	{
		match => { "message" => "%{TIMESTAMP_ISO8601:logTimestamp} %{DATA:timezone} %{DATA:wmcode} %{GREEDYDATA:syslog_message}" }
		tag_on_failure => [ ]
	}
	
	ruby 
	{
		code => "
				apparr = event.get('wmcode').split('.')
				event.set('arrlen', apparr.length)
		"
	}
	if [arrlen] == 3
	{
		ruby 
		{
			code => "
					apptemp = event.get('wmcode').split('.').first
					event.set('app', apptemp[1..apptemp.length])

					logInfo = event.get('wmcode').split('.').last
					event.set('logLevel', logInfo[logInfo.length-2..logInfo.length-2])
			"
		}
	}
	if [arrlen] != 3
	{
		ruby 
		{
			code => "
					event.set('app', event.get('wmcode')[1..event.get('wmcode').length-2])
					logInfo = event.get('wmcode').split('.').last
					event.set('logLevel', logInfo[logInfo.length-2..logInfo.length-2])
			"
		}
	}
	date 
	{
		locale => "en"
		match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss"]
		target => "logTimestamp"
	}
	mutate
	{
		remove_field => [ "@timestamp","arrlen","@version","message" ]
	}
}

output 
{
	elasticsearch 
	{
		hosts => "https://xxxxxx.us-east-1.aws.found.io:443"
		user => "xxxxx"
		password => "xxxxxx"
		index => "loganalysis"
	}
	stdout 
	{  	
		codec => rubydebug 
	}
}
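For what it's worth, the slicing done by the two ruby filters can be reproduced in plain Ruby for sanity-checking. The sample wmcode value below is an assumption, since the actual log line format isn't shown:

```ruby
# Plain-Ruby sketch of what the ruby filters compute from the grok'd wmcode field.
# "[server.log.I]" is a hypothetical sample; substitute a real value from your logs.
wmcode = "[server.log.I]"

parts  = wmcode.split('.')
arrlen = parts.length

if arrlen == 3
  apptemp   = parts.first                    # "[server"
  app       = apptemp[1..apptemp.length]     # drop the leading '[' -> "server"
  log_info  = parts.last                     # "I]"
  log_level = log_info[log_info.length - 2]  # char just before the ']' -> "I"
else
  app       = wmcode[1..wmcode.length - 2]   # strip the surrounding brackets
  log_info  = parts.last
  log_level = log_info[log_info.length - 2]
end
```

Running a sample value through this outside Logstash is a quick way to confirm the app and logLevel fields come out as expected.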

Assuming the same config worked on Windows, check the path of the log file, add sincedb_path => "/tmp/sincedb" to the input section, and see whether Logstash has written anything to it.
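The input block with that change would look like this (the sincedb location is arbitrary; any path writable by the logstash user works). As a side note, ignore_older => 0 tells the file input to skip files last modified more than 0 seconds ago, which can silently ignore every file, so dropping that line is also worth trying:

```
input 
{
	file 
	{
		path => "/xxxxx/logs/server.log"
		start_position => "beginning"
		sincedb_path => "/tmp/sincedb"
	}
}
```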

Also, can you paste the debug log? Currently the log level is info; you can change that in the Logstash settings file (logstash.yml).
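For example, in /etc/logstash/logstash.yml:

```
# Increase log verbosity; valid levels are fatal, error, warn, info, debug, trace.
log.level: debug
```

Restart Logstash afterwards for the setting to take effect.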

Does the ES instance run on port 443? I think it should be 9200.

Hi @Makra,
Yes, the same config was working on Windows.
Yes, I have checked, and 443 is the working port for ES.
I will add sincedb_path => "/tmp/sincedb" to the input section.
I will also change the log level and let you know.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.