Filebeat not sending / error sending data to Logstash

Hello, I'm trying to send MySQL logs to Logstash. Initially there was only a Logstash configuration, but I ran into the problem that Filebeat sent the messages in encoded form, so I had to rework the Filebeat config. Here it is:

# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream

  # Unique ID among all inputs, an ID is required.
  #id: 55

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /home/user/test.log
  parsers:
    - multiline:
        type: pattern
        pattern: '^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601:time}'
        negate: true
        match: after
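
The output section of the Filebeat config is not shown above; as a minimal sketch only (not the actual config from this setup), it would point Filebeat at the Logstash beats input, assuming Logstash listens on port 5044 as in the config below:

# ------------------------------ Logstash Output ------------------------------
output.logstash:
  # Assumed host/port; point this at the machine where Logstash is running
  hosts: ["localhost:5044"]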

Logstash config:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    ......
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
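
The grok body is elided above ("......"); purely as an illustration and not the original filter, a match for the "# Time:" header line of the slow log could look like this (TIMESTAMP_ISO8601 is a standard grok pattern, and the field name "time" is just an example):

filter {
  grok {
    # Capture the ISO8601 timestamp from lines such as "# Time: 2022-09-08T01:17:01.998350Z"
    match => { "message" => "^# Time: %{TIMESTAMP_ISO8601:time}" }
  }
}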

Log example:

# Time: 2022-09-08T01:17:01.998350Z
# User@Host: user[user] @ localhost []  Id: 525243
# Query_time: 0.000001  Lock_time: 0.000000 Rows_sent: 0  Rows_examined: 0
SET timestamp=434231;
# Time: 2022-09-08T02:33:15.9943350Z
# User@Host: user1[user1] @ [127.0.0.1]  Id: 132681
# Query_time: 0.000001  Lock_time: 0.000000 Rows_sent: 0  Rows_examined: 0
SET timestamp=965523;

The first problem: sometimes when I run Filebeat it simply doesn't read the file; an index is created, but it stays empty.
The second: I gave it a pattern for splitting the file and expect to see 2 hits in this case, but it glues these messages into one event.
P.S. Before that I had a Logstash config with a multiline codec and a grok template, and it parsed fine.
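
For reference, a rough sketch of what that earlier Logstash-only setup with the multiline codec might look like (an assumption, not the exact original config), reading the log straight from the file:

input {
  file {
    path => "/home/user/test.log"
    # Append every line that does not start with "# Time:" to the previous event
    codec => multiline {
      pattern => "^# Time:"
      negate => true
      what => "previous"
    }
  }
}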
