Filebeat log file input empty in Kibana


I just started learning the ELK stack. I'm trying to send logs from a log file via Filebeat directly to Elasticsearch. Each row in the log file is a JSON object. My filebeat.yml is shown below. I'm running all these services with Docker on a Windows 11 machine. I've been trying to fix this for hours but couldn't find the root cause: I don't see any errors in the Filebeat container, but no logs show up in Kibana.

  filebeat.inputs:
    - type: filestream
      enabled: true
      id: fbtestapp
      paths:
        - C:\Ash\Learn\ELK\elk\demonode\logs\log.json

  processors:
    - add_host_metadata: ~
    - add_cloud_metadata: ~
    - add_docker_metadata: ~
    - add_kubernetes_metadata: ~

  output.elasticsearch:
    hosts: ["http://elasticsearch:9200"]
    username: elastic
    password: ${ELASTIC_PASSWORD}

  setup.kibana:
    host: "elasticsearch:5601"

After spending several hours trying to make it work, I finally understood (I think) that Filebeat only needs to run as a Docker container when you want to collect logs from other containers running on the same machine. My goal is to ingest logs from a non-Docker application running on a Windows machine, so this setup wouldn't work.

I've installed the Windows version of Filebeat directly, with a slightly different YAML config. I managed to see a few logs (finally!). The caveat, though, is that I only see 115 logs (out of 1800). I don't understand why the other logs are missing. There is only one file in the folder (log.json), and it holds all 1800 records.

I shared my filebeat.yml config below.


  filebeat.inputs:
    - type: filestream
      id: test_filebeat_id10
      enabled: true
      paths:
        - C:\Ash\Learn\ELK\demonode\logs\*
      parsers:
        - ndjson:
            target: ""
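For context on the parser settings: as I understand it, with `target: ""` the ndjson parser decodes each line and places the decoded fields at the root of the event. So a log line like this (field names here are just hypothetical examples, not from my actual app):

```json
{"level": "info", "message": "user logged in", "timestamp": "2024-05-01T10:00:00Z"}
```

should end up as top-level fields `level`, `message`, and `timestamp` on the indexed document, rather than nested under some prefix.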

I'm running the following two steps after updating the yaml file:
Step 1) Set up the assets using the command ".\filebeat.exe setup -e"
Step 2) Start the Filebeat service using the command "Start-Service filebeat"

Each row in the log file is a JSON object. Any help in resolving this issue is much appreciated. Thanks!
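One way to double-check the file itself is to count how many rows actually parse as JSON, since lines the ndjson parser can't decode might explain missing events. A quick sketch (the helper name is arbitrary; the path is the one from the config above):

```python
import json

def count_ndjson_lines(path):
    """Count lines that parse as JSON vs. lines that don't (blank lines skipped)."""
    valid = invalid = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # ignore blank lines
            try:
                json.loads(line)
                valid += 1
            except json.JSONDecodeError:
                invalid += 1
    return valid, invalid

# e.g. count_ndjson_lines(r"C:\Ash\Learn\ELK\demonode\logs\log.json")
```

If this reports 1800 valid lines, the file is fine and the problem is on the Filebeat side; if not, the unparseable lines are worth inspecting.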
