Filebeat to ship Node application logs

Hello People,

I was wondering if there is any way I could ship the logs from my Node app to Elasticsearch. I tried it, but for some reason the logs mentioned in filebeat.yml won't show up.
I'm pretty sure it has something to do with the modules, but I can't work out how.
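For reference, a minimal filebeat.yml for shipping an application's log files to Logstash would look something like this (the path and host are placeholders, not my exact setup):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/my-node-app/*.log    # wherever the node app writes its logs

output.logstash:
  hosts: ["localhost:5044"]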
Thanks in advance.

@adityak248 Can you share your configuration?

Alright, I ran into new issues while working on it.
Logstash seems to be running, but when I do netstat | grep 5044 nothing comes up.
My logstash.yml is pretty much the stock default; here are my configs:
input:
input {
beats {
port => 5044
}
}
filter:
filter {
if [fileset][module] == "system" {
if [fileset][name] == "auth" {
grok {
match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:[%{POSINT:[system][auth][pid]}])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:[%{POSINT:[system][auth][pid]}])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:[%{POSINT:[system][auth][pid]}])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:[%{POSINT:[system][auth][pid]}])?: \s*%{DATA:[system][auth][user]} :frowning: %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:[%{POSINT:[system][auth][pid]}])?: new group: name=%{DATA:system.auth.groupadd.name}, GID=%{NUMBER:system.auth.groupadd.gid}",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:[%{POSINT:[system][auth][pid]}])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
"%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:[%{POSINT:[system][auth][pid]}])?: %{GREEDYMULTILINE:[system][auth][message]}"] }
pattern_definitions => {
"GREEDYMULTILINE"=> "(.|\n)"
}
remove_field => "message"
}
date {
match => [ "[system][auth][timestamp]", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
geoip {
source => "[system][auth][ssh][ip]"
target => "[system][auth][ssh][geoip]"
}
}
else if [fileset][name] == "syslog" {
grok {
match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:[%{POSINT:[system][syslog][pid]}])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }
pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)
" }
remove_field => "message"
}
date {
match => [ "[system][syslog][timestamp]", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
}
}
}
For reference, the @metadata block on the events (which the index pattern in the output relies on) looks like this:
{
...
"@metadata": {
"beat": "apm-server",
"version": "6.5.4",
"type": "doc"
}
}

and output:
output {
elasticsearch {
hosts => ["localhost:9200"]
manage_template => false
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
document_type => "system_logs"
}
stdout { codec => rubydebug }
}
################
Status:

logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: disabled)
Active: active (running) since Wed 2019-01-09 19:18:53 UTC; 1min 21s ago
Main PID: 10435 (java)
CGroup: /system.slice/logstash.service
└─10435 /bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.he...

Jan 09 19:18:53 Elk systemd[1]: Started logstash.
###########################
When I check with netstat, I don't see it listening anywhere.
Port 5044 is open.
No firewall restrictions.

Please let me know how to get Logstash listening so that Filebeat on a remote host can ship to it.

@adityak248 This looks more like a Logstash problem than a Filebeat one. Did you look at Logstash's logs to see if any errors are present?
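For a standard package install they are usually available via journalctl or under /var/log/logstash, e.g.:

journalctl -u logstash
tail -n 100 /var/log/logstash/logstash-plain.log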

Here is the error it posted
[ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 43, column 1 (byte 3192) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:42:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:92:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:317:in `block in converge_state'"]}
[2019-01-09T20:04:17,533][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

I think you have a malformed configuration, probably a missing curly brace near line 43. Since the config you posted here is not a verbatim copy of the file, I can't be more precise.
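If it helps, you can usually pin down the exact spot by running a config test against the pipeline directory (paths below assume a standard package install, adjust as needed):

/usr/share/logstash/bin/logstash --config.test_and_exit --path.settings /etc/logstash -f /etc/logstash/conf.d/

That prints the line and column of the first syntax error without restarting the service.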

Hey @pierhugues

When it says configuration error, which file is it referring to? Is it pipelines.yml, logstash.yml, or the config.d content?
Also, which port does it listen on? I don't see 5044 in the netstat output.

It's referencing the actual pipeline files in config.d/. The port doesn't show up in netstat because Logstash failed to parse the pipeline configuration.
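Once the pipeline parses cleanly, the beats input should show up as listening on 5044, e.g.:

sudo netstat -tlnp | grep 5044
# or
sudo ss -tlnp | grep 5044

Note that plain netstat without -l only lists established connections, so the -tlnp flags matter when checking for a listening port.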

Hey @pierhugues

So, coming back to the main issue: how do I get Filebeat to ship my application (Node) logs?
Should I be creating a module?
Any input here will help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.