Elastic Stack 7.17 on Ubuntu 20.04 LTS, still not indexing

Hello,
I have followed this guide, How To Install Elasticsearch, Logstash, and Kibana (Elastic Stack) on Ubuntu 20.04 | DigitalOcean,
to install my Elastic Stack, and everything went as the document explains.
But it still doesn't index or display any of the received logs.
I send the logs to port 5044 using SolarWinds (sending a test syslog message), and I can see the log arrive in tcpdump.
What's wrong or missing in this setup?
And thank you all so much for this platform.
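(For context, a quick way to check whether anything is being indexed at all is to ask Elasticsearch to list its indices; the host and port below assume the guide's default single-node setup:

curl -XGET 'http://localhost:9200/_cat/indices?v'

If no beats- or logstash-named index shows up while tcpdump sees traffic, the events are being dropped somewhere between the Logstash input and the Elasticsearch output.)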

I wanted to add this, maybe it helps:
di@elk-7:~$ sudo systemctl status logstash.service
[sudo] password for di:
● logstash.service - logstash
     Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
     Active: active (running) since Sun 2022-02-06 09:52:00 EET; 47min ago
   Main PID: 939 (java)
      Tasks: 57 (limit: 9401)
     Memory: 810.1M
     CGroup: /system.slice/logstash.service
             └─939 /usr/share/logstash/jdk/bin/java -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction>

Feb 06 09:57:28 elk-7 logstash[939]: [2022-02-06T09:57:28,433][INFO ][logstash.outputs.elasticsearch][main] Failed to perfor>
Feb 06 09:57:28 elk-7 logstash[939]: [2022-02-06T09:57:28,434][WARN ][logstash.outputs.elasticsearch][main] Attempted to res>
Feb 06 09:57:28 elk-7 logstash[939]: [2022-02-06T09:57:28,452][INFO ][logstash.outputs.elasticsearch][main] Failed to perfor>
Feb 06 09:57:28 elk-7 logstash[939]: [2022-02-06T09:57:28,453][WARN ][logstash.outputs.elasticsearch][main] Attempted to res>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,686][WARN ][logstash.outputs.elasticsearch][main] Restored connect>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,689][WARN ][logstash.outputs.elasticsearch][main] Restored connect>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,944][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch ve>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,946][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch ve>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,954][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x a>
Feb 06 09:57:33 elk-7 logstash[939]: [2022-02-06T09:57:33,964][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x a>

Hi @Mohammad_Ramadan_Abd

Could you share your beats input plugin?

It seems your output to Elasticsearch is not configured correctly?

Best regards
@fgjensen


---------------Thanks for the response--------------------
I also wanted to add: while continuing to read, I found that I am currently receiving the Linux system logs (that is, the "system" module is enabled). I also have my own ports that are supposed to be listening for incoming logs from different systems, and I need to get them up and receiving.

di@elk-7:~$ sudo cat /etc/logstash/conf.d/30-elasticsearch-output.conf
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}
--------------------------------------Inputs file too ----------------------------
di@elk-7:~$ sudo cat /etc/logstash/conf.d/02-beats-input.conf
input {
  beats {
    port => 5044
  }
  udp {
    port => 5514
    type => "syslog"
  }
  udp {
    port => 5515
    type => "WINDOWS2"
  }
  udp {
    port => 5516
    type => "WINDOWS1"
  }
  udp {
    port => 5517
    type => "VMware"
  }
  tcp {
    type => "Other"
    port => 5544
  }
}
--------------------------------------My own syslog filter-----------------------------
di@elk-7:~$ sudo cat /etc/logstash/conf.d/10-syslog-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}?%%{CISCO_REASON:facility}-%{INT:severity_level}-%{CISCO_REASON:facility_mnemonic}: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
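(For context, the grok pattern above targets Cisco-style messages, whose tag has the shape %FACILITY-SEVERITY-MNEMONIC. A hypothetical payload of that shape would be:

%SYS-5-CONFIG_I: Configured from console by console

where SYS is the facility, 5 the severity level, and CONFIG_I the mnemonic.)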
--------------------------TIA--------------------------

Hi @Mohammad_Ramadan_Abd,

If I understand the listing correctly, the configuration is stored in 3 files? You should keep the input, filter, and output sections in one file; see the example in the Logstash documentation.

In the output section you have forgotten to specify the protocol, since you are using the Elasticsearch REST API, e.g.:

hosts => ["http://localhost:9200"]
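For illustration, a minimal single-file pipeline along those lines could look like this sketch (based on your configs above; the filename is just an example):

# e.g. /etc/logstash/conf.d/beats-pipeline.conf
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}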

Good luck!

Best regards
@fgjensen

Hello, thanks for your response.
Previously, we used to have the beats input file containing the ports that will listen (the UDP ports); in a filter file, every port was added with its required indexing schema; and finally the output file containing the Elasticsearch DB connection.
I have checked the link you've sent; it describes this new version with input and output in the same file.
I am reading the link you've provided and following it, and will get back here.

I have tried the link you've provided, but still no luck.
I got your point about keeping the 3 elements (input, filter, and output) in the same file, but I still couldn't receive any logs from outside the server system.
Kindly provide me with an example based on a UDP port,
say: UDP 5514, called cisco_system,
and I will try it in my setup.

Can you share your configuration files, in the order they are in the folder, using the Preformatted text option of the forum? It is pretty hard to read the configurations without the correct formatting.

How many output files do you have? Having many different inputs and outputs in the same pipeline can be confusing at times and hard to troubleshoot; the best approach would be to isolate the pipelines using multiple pipelines with the pipelines.yml file, as sketched below.

Also, you said that you are sending syslog data to port 5044, but in your configuration port 5044 is used by the beats input; if you send anything that doesn't come from a Beats collector, the message will be dropped.

Are you using the correct port?
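For illustration, isolating the pipelines could look roughly like this in /etc/logstash/pipelines.yml (the pipeline ids and file names here are just examples):

- pipeline.id: beats
  path.config: "/etc/logstash/conf.d/beats.conf"
- pipeline.id: syslog-udp
  path.config: "/etc/logstash/conf.d/syslog.conf"

Each pipeline then carries its own input, filter, and output sections, so events from one listener cannot leak into another pipeline's output.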


Sorry for the late response, as I am having tons of work.
Regarding the configuration, I will reply again with it if the above is not clear enough.
Regarding how many files I have: I think I am still confused and attached to my previous version, ELK 5, where we had a single beats input file in which we opened the required network ports, the same number of filter files as ports (each according to the required log pattern), and finally a single output file pointing to the Elasticsearch DB.
I have 5 network ports to listen on for the received logs, apply the pattern, and save them in the Elastic DB.
I need a sample of how to use the input file to open a network port, apply a pattern, and save it in the DB.

Hello Leandro,
I have reviewed my previous setup, and obviously ELK 5 is totally different from this ELK 7.
Simply and in summary: I have various services in my data center that send syslogs to ELK 5, and I am moving now to ELK 7.
Obviously, how Logstash indexes logs is different, which causes my confusion.
Kindly tell me how to receive logs from devices on a UDP port such as 5514 and join them with the right pattern. For example, a switch sends syslog UDP packets to ELK 7 on UDP port 5514; how do I receive that log in ELK 7?
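(For illustration, a self-contained pipeline for that scenario might look roughly like the sketch below. The file name, index name, and grok pattern are assumptions; the pattern only approximates the Cisco %FACILITY-SEVERITY-MNEMONIC tag and would need testing against real device output.)

# e.g. /etc/logstash/conf.d/cisco_system.conf -- a sketch, not a tested config
input {
  udp {
    port => 5514
    type => "cisco_system"
  }
}
filter {
  if [type] == "cisco_system" {
    grok {
      # Cisco IOS messages generally carry a %FACILITY-SEVERITY-MNEMONIC: tag
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}.*%%{WORD:facility}-%{INT:severity_level}-%{WORD:facility_mnemonic}: %{GREEDYDATA:syslog_message}" }
    }
    date {
      # parse the device timestamp into @timestamp
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "cisco_system-%{+YYYY.MM.dd}"
  }
}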

Hello leandrojmp,
So far I have my UDP listener up, and the logs are being received, saved, and viewable in Kibana.
My issue now is that I need to apply a grok filter, which I have not been able to do. I need an example of applying a grok filter to any of these opened UDP ports, because we have Cisco systems, MS Windows systems, and other systems, with the below config:
---------------------Config applied----------------
di@dar-elk-7:~$ sudo cat /etc/logstash/conf.d/02-beats-input.conf
[sudo] password for di:
input {
  udp {
    port => 5514
    type => "syslog"
  }
  udp {
    port => 5515
    type => "windows2"
  }
  udp {
    port => 5516
    type => "windows1"
  }
  udp {
    port => 5517
    type => "vmware"
  }
  tcp {
    type => "Other"
    port => 5544
  }
}
di@dar-elk-7:~$ sudo cat /etc/logstash/conf.d/30-elasticsearch-output.conf
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "sylog"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
}
-----------------------------------------------------------TIA-------------------
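(For what it's worth, applying grok per input usually means branching on the type field set by each input; below is a rough sketch under the assumption that each type eventually gets its own pattern. The windows and vmware patterns are placeholders, not real parsers.)

filter {
  if [type] == "syslog" {
    grok {
      # Cisco-style tag: %FACILITY-SEVERITY-MNEMONIC: message
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}.*%%{WORD:facility}-%{INT:severity_level}-%{WORD:facility_mnemonic}: %{GREEDYDATA:syslog_message}" }
    }
  } else if [type] in ["windows1", "windows2"] {
    grok {
      # placeholder; replace with a pattern matching the actual Windows forwarder format
      match => { "message" => "%{GREEDYDATA:windows_message}" }
    }
  } else if [type] == "vmware" {
    grok {
      # placeholder for ESXi syslog; refine against real messages
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{GREEDYDATA:vmware_message}" }
    }
  }
}

Because each udp/tcp input above sets a distinct type, this one filter file can serve all of the listeners in the same pipeline.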

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.