Not getting data in Elasticsearch & Kibana from Logstash

I am trying to upgrade our 5.6 cluster to 7.6 and could certainly use some expertise. I have configured Filebeat with the config below:

filebeat:
  config:
    inputs:
      path: /etc/filebeat/filebeat.yml
  inputs:
    - input_type: log
      document_type: inf_os_logs_AT
      paths:
        - /var/log/messages
        - /var/log/secure
      exclude_files: ['.gz$']
      ignore_older: 48h
      clean_inactive: 72h
    - input_type: log
      document_type: inf_cmd_audit
      paths:
        - /var/log/.logging_history/commands.log
      exclude_files: ['.gz$']
      ignore_older: 48h
      clean_inactive: 72h
output:
  logstash:
    hosts: ["XXXXXXX:5044","XXXXXXX:5044"]
    loadbalance: true
    compression_level: 3
    worker: 6
logging:
  to_files: true
  files:
    path: /var/log/mybeat
    name: beat.log
    keepfiles: 7
    rotateeverybytes: 104857600 # = 100MB
  level: info
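
(One thing worth flagging about this config: document_type was removed after the 5.x series, so on 7.6 the type field no longer carries values like inf_os_logs_AT. A rough sketch of how I understand the same inputs would be tagged in 7.x syntax, using a custom field instead; the field name log_type is only a placeholder, not something from my actual config:)

# Sketch only: 7.x-style inputs with a custom field replacing document_type.
# "log_type" is a placeholder field name and must match whatever the
# Logstash conditional tests.
filebeat:
  inputs:
    - type: log
      paths:
        - /var/log/messages
        - /var/log/secure
      exclude_files: ['.gz$']
      fields:
        log_type: inf_os_logs_AT
    - type: log
      paths:
        - /var/log/.logging_history/commands.log
      exclude_files: ['.gz$']
      fields:
        log_type: inf_cmd_audit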

I do see Logstash receiving the payload and processing it:

[DEBUG] 2020-11-03 07:11:20.810 [defaultEventExecutorGroup-7-2] BeatsHandler - [local: xxxxx:5044, remote: xxxxx:40085] Received a new payload

[DEBUG] 2020-11-03 07:11:20.811 [defaultEventExecutorGroup-7-2] BeatsHandler - [local: xxxxx:5044, remote: xxxxx:40085] Sending a new message for the listener, sequence: 1
[DEBUG] 2020-11-03 07:11:20.814 [defaultEventExecutorGroup-7-2] BeatsHandler - [local: xxxxx:5044, remote: xxxxx:40085] Sending a new message for the listener, sequence: 2
[DEBUG] 2020-11-03 07:11:20.818 [defaultEventExecutorGroup-7-2] BeatsHandler - 91c281a4: batches pending: false
[DEBUG] 2020-11-03 07:11:20.920 [[main]>worker0] grok - Running grok filter {:event=>#<LogStash::Event:0x5e012eb2>}
[DEBUG] 2020-11-03 07:11:20.921 [[main]>worker0] grok - Event now: {:event=>#<LogStash::Event:0x5e012eb2>}
[DEBUG] 2020-11-03 07:11:20.922 [[main]>worker0] grok - Running grok filter {:event=>#<LogStash::Event:0x5b2306d4>}
[DEBUG] 2020-11-03 07:11:20.923 [[main]>worker0] grok - Event now: {:event=>#<LogStash::Event:0x5b2306d4>}
[DEBUG] 2020-11-03 07:11:21.953 [pool-4-thread-1] jvm - collector name {:name=>"ParNew"}

Logstash filter:

filter {
  if [type] =~ "inf_os_logs" or [type] =~ "XXX" or [type] =~ "XXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXXX" or [type] =~ "XXX" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}" }
    }
  }
}
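
(If the inputs were tagged with a custom field as in the sketch above, my understanding is the conditional would have to test that field instead of [type], which is no longer set from the input config on 7.x; same placeholder field name:)

filter {
  # Sketch only: matching on the placeholder custom field instead of [type].
  if [fields][log_type] =~ "inf_os_logs" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp}" }
    }
  }
}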

I am unable to determine whether the index name I expect from Filebeat is being passed, because we filter logs based on the index name. I also cannot see any activity in Elasticsearch / Kibana, which I suspect is because the logs are being dropped due to the index name.
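
(One way to check exactly what is arriving from Filebeat, assuming a temporary debug output can be added to the pipeline and removed afterwards, is to dump each event together with its @metadata:)

output {
  # Temporary debug output: prints every event, including the @metadata
  # fields added by the beats input, to Logstash's stdout/log.
  stdout { codec => rubydebug { metadata => true } }
}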

I have exhausted all the troubleshooting steps I can think of and could use an expert's suggestion.

Checking if anyone has any ideas?

I figured it out: the index name was being sent as filebeat-* and Logstash was not processing it because it was expecting a different index name.

Adding the below, along with the custom index name, in Filebeat got data into Kibana:

setup.ilm.enabled: false
setup.template.enabled: true
setup.template.name: "inf_os_logs_at-%{[agent.version]}"
setup.template.pattern: "inf_os_logs_at-%{[agent.version]}-*"
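
(For reference, the Logstash output block is not shown above; a common elasticsearch output that derives the index name from the Beats metadata looks roughly like the sketch below, which is why events end up under filebeat-* indices unless the name is overridden. The host is a placeholder.)

output {
  elasticsearch {
    hosts => ["XXXXXXX:9200"]
    # [@metadata][beat] and [@metadata][version] are added by the beats
    # input, so by default this produces indices like filebeat-7.6.x-YYYY.MM.dd.
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}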
