Save search results to a file that is continually updated

Hello,
I capture logs from a Docker container using the following configuration.

routes:
  - multiline+logstash+tcp://logstash.intern.example.com:50000
env:
  - name: SYSLOG_HOSTNAME
    value: homeassistant
  - name: INACTIVITY_TIMEOUT
    value: 1m
  - name: MULTILINE_PATTERN
    value: >-
      (\d\d(\d\d)?[-/]\d\d[-/]\d\d[T
      ]\d\d:\d\d:\d\d)|(^s6-rc:)|(^\[\d\d:\d\d:\d\d\])|(\d\d:\d\d:\d\d\
      -)|(^[TDIWEF]:)
  - name: MULTILINE_MATCH
    value: first
  - name: INCLUDE_CONTAINERS
    value: homeassistant
  - name: LOGSTASH_FIELDS
    value: source=hame01

Here is the Logstash config:

root@dsme01:~# cat /etc/logstash/conf.d/syslog/syslog_hame01.conf 
input {
  tcp {
    host => "logstash.intern.example.com"
    port => 50000
    codec => json
  }
}
## Add your filters / logstash plugins configuration here
filter {
  if ([source] == "my-home-assistant") {
    if ([docker][name] == "/homeassistant") {
      grok {
        patterns_dir => ["/usr/share/logstash/pipeline/patterns"]
        match => { "message" => "%{LOGLEVEL:log_level}%{SPACE}\(%{GREEDYDATA:log_thread}\)%{SPACE}\[%{LOGGER_NAME:log_name}\]%{SPACE}%{GREEDYDATA:log_message}" }
      }
      mutate {
        gsub => [ "log_message", "\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|M|K]", "" ]
      }
      if [log_message] =~ /\n/ {
        mutate {
          copy => { "log_message" => "log_trace" }
        }
        mutate {
          gsub => [ "log_message", "(?m)^([^\n]*)$.*", "\1" ]
        }
      }
    } else {
      drop { }
    }
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch.intern.example.com:9200"
    user => "elastic"
    password => "elastic"
    ssl => true
    cacert => "/etc/logstash/config/certs/HarbichCA.crt"
    index => "syslog_hame01-%{+YYYY.MM.dd}"
  }
}

I see the information in Kibana.
Can I also write the messages to a file on the ELK host, so that I can search their contents with another program (fail2ban)?
Greetings from Stefan Harbich

Well, it's easier to search in Kibana Discover 🙂

In order to see what Logstash will send to Elasticsearch, you can add this to the output:
file { path => "/path/syslog_%{+YYYY-MM-dd}.txt" }

Hello,
Unfortunately, that does not work. Here is my adjusted configuration:

    file {
      path => "/var/log/homeassistant/syslog_hame01.log"
    }

and the error in the log:

[2024-02-27T23:00:40,970][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:hame01, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"=>\" at line 42, column 10 (byte 1127) after output {\n  elasticsearch {\n    hosts => \"elasticsearch.intern.example.com:9200\"\n    user => \"elastic\"\n    password => \"elastic\"\n    ssl => true\n    cacert => \"/etc/logstash/config/certs/HarbichCA.crt\"\n    index => \"syslog_hame01-%{+YYYY.MM.dd}\"\n    file ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:239:in `initialize'", "org/logstash/execution/AbstractPipelineExt.java:173:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:48:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:49:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:386:in `block in converge_state'"]}

Where is the mistake?

Greetings from Stefan Harbich

"at line 42, column 10" <-- You have an issue here: the file output block was placed inside the elasticsearch block instead of next to it.
Try with this:

output {
  elasticsearch {
    hosts => "elasticsearch.intern.example.com:9200"
    user => "elastic"
    password => "pass"
    ssl => true
    cacert => "/etc/logstash/config/certs/HarbichCA.crt"
    index => "syslog_hame01-%{+YYYY.MM.dd}"
  }

  file { path => "/var/log/homeassistant/syslog_hame01.log" }
}
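Note that the file output buffers writes and flushes to disk periodically, so the file can lag behind Elasticsearch slightly while testing. A sketch that makes the flush interval explicit (the values shown are assumptions, not taken from the thread):

```
output {
  file {
    path => "/var/log/homeassistant/syslog_hame01.log"
    # seconds between flushes to disk; 0 flushes after every event
    flush_interval => 2
  }
}
```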

It just doesn't work. I checked the permissions, and from my point of view everything is fine; I even set them to 777. The users elasticsearch, logstash, and root are all allowed to write to the directory. I even created the file myself. Nothing helps: no log file is written. Do I have to install a plugin so that the file can be created? What else can I do?

No, you don't have to install the file plugin; it's already included in the Logstash installation.

The error "line 42, column 10" means your .conf file is not valid, most likely a bracket is not closed.

Again, if you run Logstash as a systemd service, it runs as the user "logstash:logstash", and you should change the ownership of the target directory accordingly. If you run it as a process from the command line, then that user needs write permission.

Basically, "file { path ... }" will write the same data that goes to Elasticsearch to the local disk.
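For the fail2ban use case, the default JSON output can be awkward to parse. A sketch that writes one plain-text line per event instead, using the line codec (the log_message field name is taken from the grok filter above; the path is only an example):

```
output {
  file {
    path => "/var/log/homeassistant/syslog_hame01.log"
    # emit only the cleaned message, one line per event, instead of the default JSON
    codec => line { format => "%{log_message}" }
  }
}
```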

I've tried everything. I checked the permissions and tried every variant, but no file is created under "/var/log/homeassistant/". I just don't know why.
Here is my config:

root@dsme01:~# cat /etc/logstash/conf.d/syslog/syslog_hame01.conf 
input {
  tcp {
    host => "logstash.intern.example.com"
    port => 50000
    codec => json
  }
}
## Add your filters / logstash plugins configuration here
filter {
  if ([source] == "my-home-assistant") {
    if ([docker][name] == "/homeassistant") {
      grok {
        patterns_dir => ["/usr/share/logstash/pipeline/patterns"]
        match => { "message" => "%{LOGLEVEL:log_level}%{SPACE}\(%{GREEDYDATA:log_thread}\)%{SPACE}\[%{LOGGER_NAME:log_name}\]%{SPACE}%{GREEDYDATA:log_message}" }
      }
      mutate {
        gsub => [ "log_message", "\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|M|K]", "" ]
      }
      if [log_message] =~ /\n/ {
        mutate {
          copy => { "log_message" => "log_trace" }
        }
        mutate {
          gsub => [ "log_message", "(?m)^([^\n]*)$.*", "\1" ]
        }
      }
    } else {
      drop { }
    }
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch.intern.example.com:9200"
    user => "elastic"
    password => "elastic"
    ssl => true
    cacert => "/etc/logstash/config/certs/HarbichCA.crt"
    index => "syslog_hame01-%{+YYYY.MM.dd}"
  }
  file { path => "/var/log/homeassistant/syslog_hame01.log" }
}

What error message do you get after you changed the .conf file?

Hello,
I found my error: my "logspout" add-on had stopped, so no data was being delivered. Now everything works. Many thanks for the help.
Greetings from Stefan Harbich
