Can someone please help me with nginx logstash filter

Hi guys,

Can someone please help me identify the issue with this filter? I am trying to parse the nginx logs with a file input and am getting the error below.

Can someone please help me rectify it?

file {
{
type => nginx_web
path => ["/var/nginx/"]
exclude => ["*.gz"]
}
}

filter {
grok {
match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
overwrite => [ "message" ]
}

mutate {
convert => ["response", "integer"]
convert => ["bytes", "integer"]
convert => ["responsetime", "float"]
}

geoip {
source => "clientip"
target => "geoip"
add_tag => [ "nginx-geoip" ]
}

date {
match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
remove_field => [ "timestamp" ]
}

useragent {
source => "agent"
}
}

output {
elasticsearch {
manage_template => false
hosts => "192.168.5.15:9200"
index => "nginx-%{+YYYY.MM.dd}"

}
stdout {
codec => "rubydebug"
}

}

And here is the error:

[root@elk5 /etc/logstash/conf.d-ELK5.x]# /usr/share/logstash/bin/logstash -f newnginx.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
19:05:30.001 [LogStash::Runner] ERROR logstash.agent - Cannot create pipeline {:reason=>"Expected one of #, input, filter, output at line 1, column 1 (byte 1) after "}

Your file input needs to be wrapped in input { ... }.
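For reference, a minimal sketch of that input with the wrapper in place. The glob in `path` and the quoting are my assumptions: the file input expects file patterns rather than a bare directory, and string values must be quoted.

```conf
input {
  file {
    type    => "nginx_web"
    # file input takes file globs, not directories ("*.log" here is an assumption)
    path    => ["/var/nginx/*.log"]
    exclude => ["*.gz"]
  }
}
```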

Hmm... any idea how to decode this one? :frowning:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
20:13:09.859 [LogStash::Runner] ERROR logstash.agent - Cannot create pipeline {:reason=>"Expected one of #, => at line 11, column 10 (byte 154) after filter {\n grok {\n match "}
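Before chasing an error like this in a running pipeline, it can help to syntax-check the file on its own; Logstash 5.x supports a config-test flag:

```shell
/usr/share/logstash/bin/logstash -f newnginx.conf --config.test_and_exit
```

The reported position (line 11, column 10, right after `match`) means the parser hit an unexpected character there; a common culprit when configs are copied from web pages is curly "smart quotes" pasted in place of straight quotes.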

input {
file {
path => "/var/log/*.log"
start_position => "beginning"
}

}

filter {
grok {
match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
overwrite => [ "message" ]
}

mutate {
convert => ["response", "integer"]
convert => ["bytes", "integer"]
convert => ["responsetime", "float"]
}

geoip {
source => "clientip"
target => "geoip"
add_tag => [ "nginx-geoip" ]
}

date {
match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
remove_field => [ "timestamp" ]
}

Seems I found the issue and fixed it; I am processing the logs...

Let's see how it goes :slight_smile:

Hi guys,

I am really struggling with the Nginx logs, and after spending more than two weeks on this I am running out of ideas.

Can someone please help me index the nginx logs? This just is not working out for me :frowning:

I would really appreciate it if someone could provide me with a working nginx Logstash config file, please.

I cleaned up / corrected your conf. There were some misplaced { braces.

input {
	file {
		type => "nginx_web"
		path => ["/var/nginx/"]
		exclude => ["*.gz"]
	}
}

filter {
	grok {
		match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
		overwrite => [ "message" ]
	}

	mutate {
		convert => ["response", "integer"]
		convert => ["bytes", "integer"]
		convert => ["responsetime", "float"]
	}

	geoip {
		source => "clientip"
		target => "geoip"
		add_tag => [ "nginx-geoip" ]
	}

	date {
		match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
		remove_field => [ "timestamp" ]
	}

	useragent {
		source => "agent"
	}
}

output {
	elasticsearch {
		manage_template => false
		hosts => "192.168.5.15:9200"
		index => "nginx-%{+YYYY.MM.dd}"
	}
	stdout {
		codec => "rubydebug"
	}
}

Great, thanks! Let me run it and I'll let you know my results :slight_smile:

I am shipping the files using Filebeat; I hope that is OK? That is, Filebeat is installed on the nginx server.

I am unsure what you are asking here. I do not know what config Filebeat accepts, but this config is for Logstash.

Well, I am confused about how to receive the messages: should they go through Filebeat, or should rsyslog send them directly to Logstash?

Which is the most practical approach in your view?

When I can, I use rsyslog / syslog-ng to send the data over to Logstash; if an application cannot use a syslog variant, I use Filebeat to read the logs and ship them to Logstash.
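For the Filebeat route, the Logstash side is a beats input; a minimal sketch (port 5044 is the conventional Beats port, an assumption here; point Filebeat's `output.logstash` hosts at the same port):

```conf
input {
  beats {
    # Port Filebeat connects to; 5044 is the conventional default
    port => 5044
  }
}
```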

Does that help?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.