Rsyslog to send logs to logstash

Hi,
I configured rsyslog to send logs to Logstash, but Kibana doesn't receive anything.
Please help.

OS: Ubuntu 20.04
cat /etc/elasticsearch/elasticsearch.yml

# Elasticsearch performs poorly when the system is swapping the memory.
#
# ---------------------------------- Network -----------------------------------
#
# Set the bind address to a specific IP (IPv4 or IPv6):
#
#network.host: localhost
network.host: 172.20.111.199
#
# Set a custom port for HTTP:
#
http.port: 9200
#
# For more information, consult the network module documentation.

cat /etc/logstash/conf.d/02-beats-input.conf

# This input block will listen on port 10514 for logs to come in.
# host should be an IP on the Logstash server.
# codec => "json" indicates that we expect the lines we're receiving to be in JSON format
# type => "rsyslog" is an optional identifier to help identify messaging streams in the pipeline.
input {
  udp {
    host => "172.20.111.199"
    port => 10514
    codec => "json"
    type => "rsyslog"
  }
}

# cat /etc/logstash/conf.d/30-elasticsearch-output.conf

# This is an empty filter block. You can later add other filters here to further process
# your log lines
filter { }

# This output block will send all events of type "rsyslog" to Elasticsearch at the configured
# host and port into daily indices of the pattern, "rsyslog-YYYY.MM.DD"
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
    }
  }
}

cat /etc/filebeat/filebeat.yml

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  #hosts: ["172.20.111.199:10514"]
  hosts: ["172.20.111.199:5044"]

cat /etc/kibana/kibana.yml

# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://172.20.111.199:9200"]
#elasticsearch.hosts: ["http://localhost:9200"]

root@elk:/etc/rsyslog.d# ls
01-json-template.conf 20-ufw.conf 21-cloudinit.conf 50-default.conf 60-output.conf

# cat 01-json-template.conf

template(name="json-template"
  type="list") {
    constant(value="{")
      constant(value="\"@timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
      constant(value="\",\"@version\":\"1")
      constant(value="\",\"message\":\"")     property(name="msg" format="json")
      constant(value="\",\"sysloghost\":\"")  property(name="hostname")
      constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
      constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
      constant(value="\",\"programname\":\"") property(name="programname")
      constant(value="\",\"procid\":\"")      property(name="procid")
    constant(value="\"}\n")
}
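Assuming the template works as intended, each syslog message would arrive at Logstash as one JSON line roughly like this (hypothetical example values, shown only to illustrate why the Logstash input uses codec => "json"):

```json
{"@timestamp":"2021-02-21T11:04:02+05:30","@version":"1","message":" connection refused","sysloghost":"elk","severity":"err","facility":"daemon","programname":"filebeat","procid":"4471"}
```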

# cat 60-output.conf

# This line sends all lines to the defined IP address at port 10514,
# using the "json-template" format template
*.*                         @172.20.111.199:10514;json-template

cat /var/log/syslog

Feb 21 11:04:02 elk filebeat[4471]: 2021-02-21T11:04:02.296+0530#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(async(tcp://172.20.111.199:5044)): dial tcp 172.20.111.199:5044: connect: connection refused
Feb 21 11:04:02 elk filebeat[4471]: 2021-02-21T11:04:02.296+0530#011INFO#011[publisher_pipeline_output]#011pipeline/output.go:145#011Attempting to reconnect to backoff(async(tcp://172.20.111.199:5044)) with 1168 reconnect attempt(s)


Kibana didn't receive any logs.

Please help me rectify this issue.

Hi @Prabhath_samarasingh, it is very hard to read the yml code.

Could you please edit your post above and format all the code by selecting it and using the format button </> ? Then perhaps we can help.

One thing I already notice is that you did not define the index in the Logstash output section, so the data is probably going to

logstash-%{+yyyy.MM.dd}
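If you want the events in a daily rsyslog index instead, one option (a sketch based on the output block you posted, not tested) is to add an index setting:

```
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      # daily indices named rsyslog-YYYY.MM.dd instead of the logstash-* default
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}
```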


Sorry.
Can you give me some idea of how to create the index for this configuration?
I'm really new to this system.

root@elk:/etc/rsyslog.d# curl '172.20.111.199:9200/_cat/indices?v'

health status index                             uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   filebeat-7.11.0-2021.02.19        IQWbdWcrR8ONaHOAawN7jA   1   1      17890            0      4.6mb          4.6mb
yellow open   filebeat-7.11.0-2021.02.17        WBpjRiN9TKmqcci3i2VFEg   1   1      78584            0     15.7mb         15.7mb
yellow open   logstash-2021.02.19-000001        fmYY0L9lSC-TzhI64CSKAA   1   1  378891728            0     51.1gb         51.1gb
yellow open   filebeat-7.11.0-2021.02.18        CpKNzF91S2C2Gp48is5KdA   1   1      24552            0      6.5mb          6.5mb
green  open   .apm-agent-configuration          ca-FRHwdTUqasVeWwHkHnw   1   0          0            0       208b           208b
yellow open   filebeat-7.11.0-2021.02.17-000001 RON3HsOhSL2g9luXu04dlQ   1   1          0            0       208b           208b
green  open   .kibana_1                         4M_sIPucR7C0PR0DoPaU-g   1   0       1608           27      2.5mb          2.5mb
green  open   .kibana-event-log-7.11.0-000001   6jbVzcKzQpeHi7KplCjkgA   1   0         13            0     13.5kb         13.5kb
green  open   .apm-custom-link                  iSYao0AUTouxuFFy73N08g   1   0          0            0       208b           208b
green  open   .kibana_task_manager_1            PUfBaYswS1SzmsVwggX9xg   1   0          8         3991    640.8kb        640.8kb
green  open   .async-search                     26Qpix7XR2Gq8DZ-02ik8A   1   0          0            0      3.7kb          3.7kb
yellow open   tutorial                          Iy_3a_QqRiuGxCRdh6bhdA   1   1          1            0      3.8kb          3.8kb
yellow open   logstash-2021.02.20-000002        G9yofbfOQy250jI9UGgIYA   1   1   84345626            0      8.3gb          8.3gb
root@elk:/etc/rsyslog.d#

cat /etc/logstash/conf.d/30-elasticsearch-output.conf

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}

cat /var/log/syslog

 Feb 21 12:49:59 elk logstash[12530]: Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
Feb 21 12:49:59 elk logstash[12530]: [2021-02-21T12:49:59,813][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.11.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10 on 11.0.8+10 +indy +jit [linux-x86_64]"}
Feb 21 12:50:00 elk logstash[12530]: [2021-02-21T12:50:00,411][FATAL][logstash.runner          ] Logstash could not be started because there is already another instance using the configured data directory.  If you wish to run multiple instances, you must change the "path.data" setting.
Feb 21 12:50:00 elk logstash[12530]: [2021-02-21T12:50:00,418][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
Feb 21 12:50:00 elk logstash[12530]: org.jruby.exceptions.SystemExit: (SystemExit) exit
Feb 21 12:50:00 elk logstash[12530]: #011at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
Feb 21 12:50:00 elk logstash[12530]: #011at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
Feb 21 12:50:00 elk logstash[12530]: #011at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
Feb 21 12:50:00 elk systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE
Feb 21 12:50:00 elk systemd[1]: logstash.service: Failed with result 'exit-code'.
Feb 21 12:50:00 elk systemd[1]: logstash.service: Scheduled restart job, restart counter is at 10.




root@elk:~#
root@elk:~# ps -ef | grep logstash
root        2903    2579  0 13:19 pts/0    00:00:00 grep --color=auto logstash
root@elk:~#
root@elk:~#
root@elk:~#
root@elk:~#
root@elk:~# /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf
Using JAVA_HOME defined java: /usr/lib/jvm/java-11-openjdk-amd64/
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-02-21T13:19:40,192][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.11.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9-Ubuntu-0ubuntu1.20.04 on 11.0.10+9-Ubuntu-0ubuntu1.20.04 +indy +jit [linux-x86_64]"}
[2021-02-21T13:19:40,721][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-02-21T13:19:41,322][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/etc/logstash/conf.d/logstash.conf"}
[2021-02-21T13:19:41,349][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2021-02-21T13:19:41,606][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-02-21T13:19:46,496][INFO ][logstash.runner          ] Logstash shut down.
[2021-02-21T13:19:46,515][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
        at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
root@elk:~#

Hi @Prabhath_samarasingh

It is very hard to tell what you are trying to do. First, if you want our help, please format your code by selecting the text and pressing the format button. I did some of that to your last post; see how much easier it is to read.

Second, you are just running a bunch of commands...

You cat this file

cat /etc/logstash/conf.d/30-elasticsearch-output.conf

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}

Then you run this command

/usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf

and it says

[2021-02-21T13:19:41,322][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/etc/logstash/conf.d/logstash.conf"}

which means that file does not exist or is not a valid Logstash configuration file.

I think you are close, but I cannot tell where the problem is.

This looks correct....

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}

You can try running this to check; it will load the configurations from the pipelines.yml:

/usr/share/logstash/bin/logstash --path.settings /etc/logstash
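On a package install, pipelines.yml usually just points the main pipeline at conf.d (a sketch of the typical default; check your own file):

```yaml
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"
```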

It also looks like Logstash was already writing to...

logstash-2021.02.20-000002

If not, lets start back at the very beginning...

Please confirm you are trying to run this architecture

rsyslog -> filebeat -> logstash -> elasticsearch

Please run the following and format the results and I will take a look.

cat /etc/logstash/logstash.yml

cat /etc/logstash/pipelines.yml

ls -l /etc/logstash/conf.d


Thanks for your kind help.
Yes, I need to run the "rsyslog -> filebeat -> logstash -> elasticsearch" architecture.

root@elk:~# /usr/share/logstash/bin/logstash --path.settings /etc/logstash
Using JAVA_HOME defined java: /usr/lib/jvm/java-11-openjdk-amd64/
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-02-22T09:51:01,745][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.11.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9-Ubuntu-0ubuntu1.20.04 on 11.0.10+9-Ubuntu-0ubuntu1.20.04 +indy +jit [linux-x86_64]"}
[2021-02-22T09:51:04,818][INFO ][org.reflections.Reflections] Reflections took 59 ms to scan 1 urls, producing 23 keys and 47 values 
[2021-02-22T09:51:05,747][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.20.111.199:9200/]}}
[2021-02-22T09:51:05,959][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://172.20.111.199:9200/"}
[2021-02-22T09:51:06,012][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2021-02-22T09:51:06,018][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-02-22T09:51:06,097][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.20.111.199:9200"]}
[2021-02-22T09:51:06,154][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-02-22T09:51:06,195][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>32, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>4000, "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/30-elasticsearch-output.conf"], :thread=>"#<Thread:0x79ca1f67 run>"}
[2021-02-22T09:51:06,257][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2021-02-22T09:51:07,774][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.57}
[2021-02-22T09:51:07,814][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-02-22T09:51:07,908][INFO ][logstash.inputs.udp      ][main][a6c77bc59fe85849e4ded46b15bb40ebe99fe2061050875e32b231be0d1e6c94] Starting UDP listener {:address=>"172.20.111.199:10514"}
[2021-02-22T09:51:07,920][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-02-22T09:51:07,981][INFO ][logstash.inputs.udp      ][main][a6c77bc59fe85849e4ded46b15bb40ebe99fe2061050875e32b231be0d1e6c94] UDP listener started {:address=>"172.20.111.199:10514", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[2021-02-22T09:51:08,168][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
root@elk:~# cat /etc/logstash/logstash.yml

# ------------ Data path ------------------
path.data: /var/lib/logstash
#
# ------------ Pipeline Settings --------------
pipeline.ordered: auto
#
# ------------ Pipeline Configuration Settings --------------


# ------------ HTTP API Settings -------------

# ------------ Cloud Settings ---------------

# ------------ Queuing Settings --------------

#
# ------------ Dead-Letter Queue Settings --------------
#


# ------------ Metrics Settings --------------
0
#
# ------------ Debugging Settings --------------

# ------------ Other Settings --------------

# ------------ X-Pack Settings (not applicable for OSS build)--------------

#####


cat /etc/logstash/pipelines.yml

root@elk:~# cat /etc/logstash/pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"
root@elk:~# 

root@elk:~# ls -l /etc/logstash/conf.d
total 16
-rw-r--r-- 1 root root 412 Feb 20 18:47 02-beats-input.conf
-rw-r--r-- 1 root root  61 Feb 20 12:35 02-beats-input.conf-Org
-rw-r--r-- 1 root root 433 Feb 21 12:46 30-elasticsearch-output.conf
-rw-r--r-- 1 root root 422 Feb 19 18:22 30-elasticsearch-output.conf-Org
root@elk:~#

OK, the logstash.yml looks incomplete / wrong... that cannot be the whole file. We can come back to that later.

What happened when you ran logstash?

Can you please now share

cat /etc/logstash/conf.d/02-beats-input.conf

and

cat /etc/logstash/conf.d/30-elasticsearch-output.conf

It looks like logstash is listening on port 10514 but for beats it should usually be listening on 5044 and not UDP so we need to look at the 02-beats-input.conf

[2021-02-22T09:51:07,908][INFO ][logstash.inputs.udp      ][main][a6c77bc59fe85849e4ded46b15bb40ebe99fe2061050875e32b231be0d1e6c94] Starting UDP listener {:address=>"172.20.111.199:10514"}

The input looks like Logstash is trying to listen to rsyslog directly, not through Beats...

So you need to decide if you want

rsyslog -> filebeat -> logstash -> elasticsearch

or what you appear to have configured. There is no reason you have to use Filebeat unless you want to; Logstash can listen to syslog directly as well:

rsyslog -> logstash -> elasticsearch

Which one?

I used Filebeat because it acts as a filter in Kibana. Is that not the way to analyze my logs in Kibana?

Yes, I planned to configure it like this:
"rsyslog -> filebeat -> logstash -> elasticsearch"

[*** Before integrating rsyslog, Kibana got the graphs.]

root@elk:~# cat /etc/logstash/conf.d/02-beats-input.conf
# This input block will listen on port 10514 for logs to come in.
# host should be an IP on the Logstash server.
# codec => "json" indicates that we expect the lines we're receiving to be in JSON format
# type => "rsyslog" is an optional identifier to help identify messaging streams in the pipeline.
input {
  udp {
    host => "172.20.111.199"
    port => 10514
    codec => "json"
    type => "rsyslog"
  }
}

root@elk:~# 
root@elk:~# cat /etc/logstash/conf.d/30-elasticsearch-output.conf
# This is an empty filter block.  You can later add other filters here to further process
# your log lines
filter { }
# This output block will send all events of type "rsyslog" to Elasticsearch at the configured
# host and port into daily indices of the pattern, "rsyslog-YYYY.MM.DD"
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}
root@elk:~#

If you want to use Filebeat

Your Filebeat should use the syslog input, which should be listening as the syslog server on the correct IP and port, and its Logstash output should point to the Logstash server at port 5044.
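For example, the relevant parts of filebeat.yml could look like this (a sketch only, assuming the same IP and ports used elsewhere in this thread; adjust to your setup):

```yaml
filebeat.inputs:
  # Filebeat plays the role of the syslog server that rsyslog forwards to
  - type: syslog
    protocol.udp:
      host: "172.20.111.199:10514"

# Forward the events to Logstash on the usual beats port
output.logstash:
  hosts: ["172.20.111.199:5044"]
```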

/etc/logstash/conf.d/02-beats-input.conf

Should look like.

input {
  beats {
    port => 5044
    type => rsyslog
  }
}

Rsyslog -> Filebeat syslog input (syslog server and port) / Filebeat Logstash output (Logstash server and port 5044) -> Logstash beats input 5044 / Logstash elasticsearch output -> Elasticsearch

root@elk:/etc/rsyslog.d# pwd
/etc/rsyslog.d
root@elk:/etc/rsyslog.d# ls
01-json-template.conf  20-ufw.conf  21-cloudinit.conf  50-default.conf  60-output.conf
root@elk:/etc/rsyslog.d# 

root@elk:/etc/rsyslog.d# cat 60-output.conf

*.*                         @172.20.111.199:10514;json-template


root@elk:/etc/rsyslog.d# cd ../filebeat/
root@elk:/etc/filebeat# ls
fields.yml  filebeat.reference.yml  filebeat.yml  filebeat.yml_2021.02.20.11.30.47  filebeat.yml-ORG  modules.d



root@elk:/etc/filebeat# cat filebeat.yml


# ============================== Filebeat inputs ===============================

filebeat.inputs:

- type: log

  paths:
    - /var/log/*.log

# filestream is an experimental input. It is going to replace log input in the future.
- type: filestream


  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
  

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  
# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1


# ================================== General ===================================



# ================================= Dashboards =================================

# =================================== Kibana ===================================


setup.kibana:

  

# =============================== Elastic Cloud ================================



# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------


# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
      hosts: ["172.20.111.199:5044"]



# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

# ================================== Logging ===================================



# ============================= X-Pack Monitoring ==============================




# ============================== Instrumentation ===============================




# ================================= Migration ==================================

# This allows to enable 6.7 migration aliases
#migration.6_to_7.enabled: true

Is this configuration correct?
Will 60-output.conf ship logs to Logstash on port 10514?
And are there any changes needed in filebeat.yml?

I am not a rsyslog expert.

Again, I thought you wanted to ship rsyslog to Filebeat first, so why do you keep saying ship to Logstash?

I think you need to slow down and decide what you want to do. Pretty much everything you need I have put in these posts.

Personally I would start with simple

rsyslog -> logstash -> elasticsearch

which is basically what you had working in the very beginning. Filebeat adds another layer, which seems to be confusing you. I am not sure where you downloaded all these configs from, but I would go back to a working configuration and iterate from there.
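For that simpler path, a minimal Logstash pipeline (a sketch assembled only from configs already in this thread) would be the UDP input plus the Elasticsearch output you already have, with rsyslog still sending via `*.* @172.20.111.199:10514;json-template`:

```
input {
  udp {
    host => "172.20.111.199"
    port => 10514
    codec => "json"
    type => "rsyslog"
  }
}

output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "172.20.111.199:9200" ]
      index => "rsyslog-%{+YYYY.MM.dd}"
    }
  }
}
```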

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.