I am working with an application that writes to a log file. Filebeat sends the log data to Kafka (output.kafka), which feeds a central logging server. I am testing the process by appending to the log file with echo.
This command writes data to the log index:
echo 2024-05-17 21:35:50,394 - daemon - INFO: - TEST LOG MESSAGE >> my_daemon.log
This command does not write data to the log index:
echo 2024-05-17 21:35:50,394 - daemon - INFO - TEST LOG MESSAGE >> my_daemon.log
Notice that the second command differs from the first only by the missing colon after "INFO". This suggests to me that something is rejecting the message based on its format.
Where should I look to find what is validating and rejecting the message?
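To narrow down where the message is being dropped, I assume I can consume the topic directly on one of the brokers and check whether the colon-less message ever reaches Kafka. A rough sketch, assuming the standard Kafka console tools are installed under /opt/kafka (an assumption; the broker host and topic are taken from my config below):
# Read the topic from the beginning and filter for the test message;
# if it shows up here, Kafka accepted it and the rejection happens further downstream.
/opt/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server server01.domain.com:9092 \
  --topic MyTopic \
  --from-beginning | grep "TEST LOG MESSAGE"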
I am using Filebeat 7.12.1:
# filebeat version
filebeat version 7.12.1 (amd64), libbeat 7.12.1 [651a2ad1225f3d4420a22eba847de385b71f711d built 2021-04-20 20:58:32 +0000 UTC]
Here is my filebeat.yml file.
filebeat.inputs:
- type: log
  paths:
    - /opt/my/path/api/my_web.log
    - /opt/my/path/api/my_daemon.log
    - /opt/my/path/api/my_daemon.error.log
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
fields: {app_name: "Application Name", app_campus: "US", app_env: "PROD", app_client: "MyApp"}
output.kafka:
  enabled: true
  version: '0.10.0.1'
  hosts: ["server01.domain.com:9092", "server02.domain.com:9092", "server03.domain.com:9092"]
  topic: 'MyTopic'
  partition.round_robin:
    reachable_only: true
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000000
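For completeness, I assume Filebeat's built-in checks are a reasonable way to confirm that this config parses and that the Kafka output is reachable; a sketch, assuming the config sits at the default /etc/filebeat/filebeat.yml (which may not match my layout):
# Validate the configuration and test connectivity to the configured output.
filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml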
This is my kafka.yml file.
# Module: kafka
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.x/filebeat-module-kafka.html
- module: kafka
  # All logs
  log:
    enabled: true
    # Set custom paths for Kafka. If left empty,
    # Filebeat will look under /opt.
    #var.kafka_home:
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:
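I assume I can confirm which modules are actually enabled with Filebeat's modules command:
# List enabled and disabled Filebeat modules.
filebeat modules list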
It is not clear to me whether Filebeat or Kafka is the component performing this validation. This might even be a Kibana question.
The central logging cluster is named "central_logging". It is running Elasticsearch 7.17.18.
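If the rejection happens on the Elasticsearch side, I assume the place to look is an ingest pipeline (or whatever consumes the Kafka topic) rather than Kafka itself. A sketch of the checks I have in mind, with the host and index pattern as placeholders since I do not know the exact values:
# List the ingest pipelines defined on the cluster (host is a placeholder).
curl -s 'http://es-host.domain.com:9200/_ingest/pipeline?pretty'
# Search the log index for the test message (index pattern is a placeholder).
curl -s 'http://es-host.domain.com:9200/my-index-*/_search?q=%22TEST%20LOG%20MESSAGE%22&pretty'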