sgladkovich
(Sergey Gladkovich)
November 20, 2017, 10:24pm
Hello,
I have a single-node Elastic Stack 5.6.3 on Ubuntu 14.04 in AWS, and I'm trying to get AWS RDS logs shipped to it. I have a script (on the same node) that downloads the log files with no issues.
Here's my Logstash input configuration:
input {
  file {
    id => "input-file-rds-postgresql"
    path => [ "/var/local/logstash/rds/*/error/*.log.*" ]
    add_field => { "type" => "rds-postgresql" }
  }
}
It's followed by this filter:
filter {
  if [fields][type] == "rds-postgresql" {
    grok {
      id => "filter-grok-rds-postgresql"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => [ "%{PGLOG}" ] }
    }
    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss z", "YYYY-MM-dd HH:mm:ss" ]
    }
  }
}
Here's the custom pattern file for the above:
PGUSERNAME [a-zA-Z0-9\-\.]+
PGDBNAME [a-zA-Z0-9\-\.]+
PGDATE %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}
PGLOGLEVEL (?:DEBUG(?:\d)?|INFO|NOTICE|WARNING|ERROR|LOG|FATAL|PANIC)
PGLOG %{PGDATE:timestamp} %{TZ:timezone}\:(?:%{IPORHOST:[pg][remote_host]}\(%{POSINT:[pg][remote_port]:int}\))?\:(?:%{PGUSERNAME:[pg][user]})?\@(?:%{PGDBNAME:[pg][instance]})?\:\[(?:%{POSINT:[pg][pid]:int})?\]\:%{PGLOGLEVEL:[pg][severity]}\: %{GREEDYDATA:[pg][message]}
The pattern has been tested against the log contents in Grok debugger.
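For reference, here is a hypothetical log line in the shape PGLOG expects (every value below is invented for illustration, not taken from the actual RDS logs):

  2017-11-20 22:10:05 UTC:10.0.0.1(51234):app_user@mydb:[12345]:LOG: connection received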
As I download the logs, I can see Logstash creating a sincedb file in /var/lib/logstash/plugins/inputs/file/; I spot-checked a few inodes, and the sincedb file does reflect Logstash picking up the files I need. Running ls with the path pattern from the file input also lists the files in question.
Now, when I go to Kibana and look for
fields.type:rds-postgresql
I get no results.
It seems to me that I'm missing something very basic, but so far I haven't been able to spot it.
Any help or insights would be highly appreciated.
Thank you,
Sergey
As you are setting the type field in the file input, I would expect it to be at the root level rather than fields.type.
sgladkovich
(Sergey Gladkovich)
November 21, 2017, 9:58am
Thanks for responding - that makes sense. I changed the filter part to read
if [type] == "rds-postgresql" {
but still nothing in Elasticsearch.
My understanding from other portions of Elastic documentation was that "add_field" doesn't do it at the root level, but instead adds it to "fields" - please correct me if I'm mistaken here. Also, I specifically avoid setting the root-level "type" as I understand it's been deprecated.
Another note: I have beats input configured on the same node with fields.type added by Filebeat (from various sources) and they work just fine.
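In case the distinction trips anyone else up: Filebeat's fields option nests custom fields under [fields] by default (unless fields_under_root is enabled), whereas add_field in a Logstash input writes to exactly the field reference you give it, here the top-level [type]. A conditional covering events from both sources could be sketched as:

  filter {
    if [type] == "rds-postgresql" or [fields][type] == "rds-postgresql" {
      # grok / date blocks as above
    }
  }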
Print the events to stdout using a rubydebug codec. That will allow you to see what the final event looks like and will make troubleshooting this a lot easier.
sgladkovich
(Sergey Gladkovich)
November 21, 2017, 10:59pm
Christian_Dahlqvist, thanks again. I have added this output:
output {
  stdout {
    id => "output-stdout"
    codec => "rubydebug"
  }
}
but now Logstash won't start at all. The error is:
[2017-11-21T22:54:00,478][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NameError: undefined local variable or method `dotfile' for #<AwesomePrint::Inspector:0x48a1f44f>>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/awesome_print-1.8.0/lib/awesome_print/inspector.rb:163:in `merge_custom_defaults!'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/awesome_print-1.8.0/lib/awesome_print/inspector.rb:50:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/awesome_print-1.8.0/lib/awesome_print/core_ext/kernel.rb:9:in `ai'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-rubydebug-3.0.4/lib/logstash/codecs/rubydebug.rb:39:in `encode_default'", "org/jruby/RubyMethod.java:120:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-rubydebug-3.0.4/lib/logstash/codecs/rubydebug.rb:35:in `encode'", "/usr/share/logstash/logstash-core/lib/logstash/codecs/base.rb:50:in `multi_encode'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/codecs/base.rb:50:in `multi_encode'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:90:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/single.rb:15:in `multi_receive'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/single.rb:14:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:49:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:434:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:433:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:381:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:342:in `start_workers'"]}
Not sure what else to try at this point.
Best regards,
Sergey
I don’t think the rubydebug should be quoted.
sgladkovich
(Sergey Gladkovich)
November 21, 2017, 11:13pm
Thank you! Trying again...
sgladkovich
(Sergey Gladkovich)
November 21, 2017, 11:20pm
No luck. New config:
output {
  stdout {
    id => "output-stdout"
    codec => rubydebug
  }
}
Same error...
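For anyone hitting the same crash: this awesome_print "dotfile" failure has been reported when the gem cannot read a dotfile in the Logstash user's home directory, so it is likely an environment problem rather than a pipeline-config one. Until it is resolved, a file output with a json_lines codec is an alternative way to inspect the final events; a sketch, with an illustrative path:

  output {
    file {
      id => "output-file-debug"
      path => "/tmp/logstash-debug.log"
      codec => json_lines
    }
  }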
I do not see how that error is necessarily related to the stdout output plugin. Can you show the rest of your config?
sgladkovich
(Sergey Gladkovich)
November 22, 2017, 8:53am
The total config is about 150 lines with comments excluded. Is it OK to post it in a reply here? Also, please note: Logstash starts up fine without the output stdout section. In any case, I truly appreciate your help.
sgladkovich
(Sergey Gladkovich)
November 22, 2017, 6:03pm
Christian_Dahlqvist, here's the full config that works (with the exception of rds-postgresql filter):
$ cat /etc/logstash/conf.d/*.conf | grep -v '^#'
input {
  beats {
    port => 5044
  }
}
input {
  file {
    id => "input-file-rds-postgresql"
    path => [ "/var/local/logstash/rds/*/error/*.log.*" ]
    add_field => { "type" => "rds-postgresql" }
  }
}
filter {
  grok {
    id => "filter-grok-asg-color"
    match => ["host", "[a-z0-9]+-web-(?<asg_color>[a-z]+)-[a-z0-9]{3}"]
  }
}
filter {
  grok {
    id => "filter-grok-environment"
    match => ["host", "(?<environment>[a-z0-9]+)"]
  }
}
filter {
  if [fields][type] == "authlog" {
    grok {
      id => "filter-grok-authlog"
      match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:system.auth.groupadd.name}, GID=%{NUMBER:system.auth.groupadd.gid}",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
        "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"]
      }
      pattern_definitions => {
        "GREEDYMULTILINE" => "(.|\n)*"
      }
      remove_field => "message"
    }
    date {
      match => [ "[system][auth][timestamp]", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
filter {
  if [fields][type] == "celery" {
    grok {
      id => "filter-grok-celery"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => "%{CELERY}" }
    }
    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss,SSS" ]
    }
  }
}
filter {
  if [fields][type] == "codedeploy" {
    grok {
      id => "filter-grok-codedeploy"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      break_on_match => false
      match => { "message" => [ "%{CODEDEPLOYMARK}", "%{CODEDEPLOY}", "%{CODEDEPLOYSCRIPT}" ] }
    }
    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss.SSS", "YYYY-MM-dd HH:mm:ss Z" ]
    }
  }
}
filter {
  if [fields][type] == "rds-postgresql" {
    grok {
      id => "filter-grok-rds-postgresql"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => [ "%{PGLOG}" ] }
    }
    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss z", "YYYY-MM-dd HH:mm:ss" ]
    }
  }
}
filter {
  if [fields][type] == "site-backend" {
    grok {
      id => "filter-grok-site-backend"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => "%{SITEBACKEND}" }
    }
  }
}
filter {
  if [fields][type] == "site-error" {
    grok {
      id => "filter-grok-site-error"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => "%{SITEERROR}" }
    }
    date {
      match => [ "timestamp", "YYYY/MM/dd HH:mm:ss" ]
    }
  }
}
filter {
  if [fields][type] == "site-static" {
    grok {
      id => "filter-grok-site-static"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => "%{SITESTATIC}" }
    }
  }
}
filter {
  if [fields][type] == "syslog" {
    grok {
      id => "filter-grok-syslog"
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
filter {
  if [fields][type] == "uwsgi" {
    grok {
      id => "filter-grok-uwsgi"
      patterns_dir => "/etc/logstash/patterns"
      patterns_files_glob => "*.conf"
      match => { "message" => [ "%{UWSGI}", "%{UWSGILOG}" ] }
    }
    date {
      match => [ "timestamp", "[EEE MMM dd HH:mm:ss yyyy]", "[EEE MMM d HH:mm:ss yyyy]", "yyyy-dd-MM HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    template_overwrite => "true"
  }
}
$
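One detail worth flagging when reading this config: events from the file input never pass through Beats, so [@metadata][beat] is unset for them, and the sprintf reference in the elasticsearch output's index name stays unresolved; such events land in an index whose name contains the literal text %{[@metadata][beat]}, which a beats-style Kibana index pattern will not match. One way to separate the two streams is a guarded output; the fallback index name below is illustrative:

  output {
    if [@metadata][beat] {
      elasticsearch {
        hosts => [ "localhost:9200" ]
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        template_overwrite => "true"
      }
    } else {
      elasticsearch {
        hosts => [ "localhost:9200" ]
        index => "rds-%{+YYYY.MM.dd}"
      }
    }
  }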
system
(system)
Closed
December 20, 2017, 6:04pm
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.