Send logs from Filebeat to Elasticsearch

I am trying to send logs from Filebeat to Elasticsearch. Here is the filebeat.yml:

filebeat.inputs:
- type: filestream
  id: my-filestream-id
  enabled: true
  paths:
    - C:\ProgramData\sample_logs\sample.log
- type: log
  enabled: true
  paths:
    - C:\ProgramData\sample_logs\sample.log

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["elastic_ip:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "elastic"
  password: "pwd"

I have enabled the elasticsearch module.
The command
.\filebeat -e -c "C:\Program Files\Filebeat\filebeat.yml" test output
gives a connection OK result.
.\filebeat -e -c "C:\Program Files\Filebeat\filebeat.yml" -d "publish"
displays a lot of entries on the console.

In Kibana, when I navigate to the Discover tab, I get the option to create a data view for the index pattern filebeat-8.7.0, so the index has been created. But there are no logs on the dashboard.

Also, there is no entry for filebeat-8.7.0 in the Index Management tab.
Why are no entries coming through? The configured path has log data.

If the cluster is secured, the protocol should be https, so try un-commenting that line.

Also check the Filebeat logs, as they may contain clues, and please capture and post what it outputs to the console.
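
If the console output is too long to copy by hand, one option (a sketch, assuming PowerShell) is to tee it into a file while it runs:

.\filebeat -e -c "C:\Program Files\Filebeat\filebeat.yml" -d "publish" 2>&1 | Tee-Object -FilePath .\filebeat-console.log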

I've disabled https for the cluster. There are no traces in the Filebeat logs. The log is very big and difficult to post here.

From Kibana Dev Tools, run

GET _cat/indices/?

And show the output
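
For a quick view of document counts as well, something like this should work (assuming the default filebeat-* naming; the v parameter just adds column headers):

GET _cat/indices/filebeat-*?v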

Here's the output:

yellow open .ds-filebeat-8.7.0-2023.04.24-000001 f4117_dZSOGMLf_Ivl5CJw 1 1 0 0 225b 225b

Yup, no data...

Can you share the Filebeat logs?

Also, Filebeat will only read a file once... so if it has already read it, it will not read it again.

You will need to clean out the data registry.
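
A rough sketch of that on Windows, assuming the zip install under C:\Program Files\Filebeat (if Filebeat runs as a service, the data path may be C:\ProgramData\filebeat instead):

# stop Filebeat first (Ctrl+C if it is running in the foreground, or stop the service)
# then delete the registry so previously-read files are picked up from the beginning again
Remove-Item -Recurse -Force "C:\Program Files\Filebeat\data\registry"
# restart Filebeat afterwards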

This is filebeat.yml

filebeat.inputs:

- type: filestream
  id: my-filestream-id

  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\ProgramData\sample_logs\sample.log

- type: log

  enabled: true
  paths:
    - C:\ProgramData\sample_logs\sample.log

- type: syslog
  enabled: false

logging:
  level: info
  to_files: true
  to_syslog: false

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["elastic_ip:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "elastic"
  password: "pwd"

This is the log file C:\ProgramData\sample_logs\sample.log

192.168.2.20 - - [28/Jul/2006:10:27:10 -0300] "GET /cgi-bin/try/ HTTP/1.0" 200 3395
127.0.0.1 - - [28/Jul/2006:10:22:04 -0300] "GET / HTTP/1.0" 200 2216
192.168.2.20 - - [28/Jul/2006:10:27:10 -0300] "GET /cgi-bin/try/ HTTP/1.0" 200 395
127.0.0.1 - - [28/Jul/2006:10:22:04 -0300] "GET / HTTP/1.0" 200 221

If the file has been read once, I should be able to see it in the logs, right? But there is no data at all.

Hey, I am receiving the logs now. I was wrong about the file path. The file was saved as sample.log.txt and I was specifying sample.log.
Thank you so much for the quick response!
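
For anyone hitting the same thing: Windows Explorer hides known file extensions by default, so listing the directory from PowerShell shows the real names, e.g.:

Get-ChildItem C:\ProgramData\sample_logs

which would have shown sample.log.txt instead of sample.log.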
