[SOLVED] I can't use winlogbeat with Kibana

Hi everyone, I'm running Elasticsearch and Kibana on a Debian 10 machine, and everything was working perfectly. Then I tried installing Winlogbeat on a Windows 10 machine, and that's where my problem begins. I've been searching for many hours but couldn't find a solution.

Here are my config files (note that I'm only posting what I uncommented; the rest of each file remains at its defaults).

Kibana file (kibana.yml):

    # Kibana is served by a back end server. This setting specifies the port to use.
    server.port: 5601

    # To allow connections from remote users, set this parameter to a non-loopback address.
    server.host: "0.0.0.0"

Elasticsearch file (elasticsearch.yml):


    # ----------------------------------- Paths ------------------------------------
    #
    # Path to directory where to store the data (separate multiple locations by comma):
    #
    path.data: /var/lib/elasticsearch
    #
    # Path to log files:
    #
    path.logs: /var/log/elasticsearch
    #
    # ---------------------------------- Network -----------------------------------
    #
    # Set the bind address to a specific IP (IPv4 or IPv6):
    #
    network.host: 192.168.0.1
    #
    # Set a custom port for HTTP:
    #
    http.port: 9200
    #
    # For more information, consult the network module documentation.
    #
    # --------------------------------- Discovery ----------------------------------
    #
    # Pass an initial list of hosts to perform discovery when this node is started:
    # The default list of hosts is ["127.0.0.1", "[::1]"]
    #
    #discovery.seed_hosts: ["host1", "host2"]
    discovery.seed_hosts: ["host1"]
    #
    # Bootstrap the cluster using an initial set of master-eligible nodes:
    #
    #cluster.initial_master_nodes: ["node-1", "node-2"]
    cluster.initial_master_nodes: ["node-1"]
    #

My Winlogbeat configuration file (winlogbeat.yml) on the Windows 10 machine:


    # ======================== Winlogbeat specific options =========================
    winlogbeat.event_logs:
      - name: Application
        ignore_older: 72h

      - name: System

      - name: Security
        processors:
          - script:
              lang: javascript
              id: security
              file: ${path.home}/module/security/config/winlogbeat-security.js

      - name: Microsoft-Windows-Sysmon/Operational
        processors:
          - script:
              lang: javascript
              id: sysmon
              file: ${path.home}/module/sysmon/config/winlogbeat-sysmon.js

      - name: Windows PowerShell
        event_id: 400, 403, 600, 800
        processors:
          - script:
              lang: javascript
              id: powershell
              file: ${path.home}/module/powershell/config/winlogbeat-powershell.js

      - name: Microsoft-Windows-PowerShell/Operational
        event_id: 4103, 4104, 4105, 4106
        processors:
          - script:
              lang: javascript
              id: powershell
              file: ${path.home}/module/powershell/config/winlogbeat-powershell.js

      - name: ForwardedEvents
        tags: [forwarded]
        processors:
          - script:
              when.equals.winlog.channel: Security
              lang: javascript
              id: security
              file: ${path.home}/module/security/config/winlogbeat-security.js
          - script:
              when.equals.winlog.channel: Microsoft-Windows-Sysmon/Operational
              lang: javascript
              id: sysmon
              file: ${path.home}/module/sysmon/config/winlogbeat-sysmon.js
          - script:
              when.equals.winlog.channel: Windows PowerShell
              lang: javascript
              id: powershell
              file: ${path.home}/module/powershell/config/winlogbeat-powershell.js
          - script:
              when.equals.winlog.channel: Microsoft-Windows-PowerShell/Operational
              lang: javascript
              id: powershell
              file: ${path.home}/module/powershell/config/winlogbeat-powershell.js

    # ====================== Elasticsearch template settings =======================

    setup.template.settings:
      index.number_of_shards: 1
      #index.codec: best_compression
      #_source.enabled: false

    # ================================= Dashboards =================================
    # These settings control loading the sample dashboards to the Kibana index. Loading
    # the dashboards is disabled by default and can be enabled either by setting the
    # options here or by using the `setup` command.
    setup.dashboards.enabled: true

    # =================================== Kibana ===================================

    # Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
    # This requires a Kibana endpoint configuration.
    setup.kibana:

      # Kibana Host
      # Scheme and port can be left out and will be set to the default (http and 5601)
      # In case you specify an additional path, the scheme is required: http://localhost:5601/path
      # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
      host: "192.168.0.1:5601"

      # Kibana Space ID
      # ID of the Kibana Space into which the dashboards should be loaded. By default,
      # the Default Space will be used.
      #space.id:

    # ================================== Outputs ===================================

    # Configure what output to use when sending the data collected by the beat.

    # ---------------------------- Elasticsearch Output ----------------------------
    output.elasticsearch:
      # Array of hosts to connect to.
      hosts: ["192.168.0.1:9200"]

      # Protocol - either `http` (default) or `https`.
      #protocol: "https"

      # Authentication credentials - either API key or username/password.
      #api_key: "id:api_key"
      #username: "elastic"
      #password: "changeme"

    # ================================= Processors =================================
    processors:
      - add_host_metadata:
          when.not.contains.tags: forwarded
      - add_cloud_metadata: ~
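
One thing worth mentioning for this setup: Winlogbeat ships `test` subcommands that validate the config file and check connectivity to the configured output before you run setup. From an administrator PowerShell in the Winlogbeat install directory, something like the following should work (a sketch, assuming the default file layout of the Windows zip install):

```shell
# Validate the winlogbeat.yml syntax
.\winlogbeat.exe test config -c .\winlogbeat.yml -e

# Check that the Elasticsearch output configured above is reachable
.\winlogbeat.exe test output -c .\winlogbeat.yml -e
```

The `-e` flag just sends the log output to stderr so errors show up directly in the console.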

If I try:

    curl -X GET "192.168.0.1:9200/_cluster/health"

I get:

    {
      "cluster_name" : "elasticsearch",
      "status" : "yellow",
      "timed_out" : false,
      "number_of_nodes" : 1,
      "number_of_data_nodes" : 1,
      "active_primary_shards" : 9,
      "active_shards" : 9,
      "relocating_shards" : 0,
      "initializing_shards" : 0,
      "unassigned_shards" : 2,
      "delayed_unassigned_shards" : 0,
      "number_of_pending_tasks" : 0,
      "number_of_in_flight_fetch" : 0,
      "task_max_waiting_in_queue_millis" : 0,
      "active_shards_percent_as_number" : 81.81818181818183
    }
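
For future readers: curl's raw JSON output is much easier to read piped through Python's `json.tool`, which should already be available on a Debian box with python3 installed. A small sketch (the `echo` stands in for the live curl output shown above):

```shell
# json.tool reads JSON on stdin and prints it indented; against a live
# cluster you would pipe curl into it instead of echo:
#   curl -s "192.168.0.1:9200/_cluster/health" | python3 -m json.tool
echo '{"cluster_name":"elasticsearch","status":"yellow"}' | python3 -m json.tool
```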

If I try accessing 192.168.0.1:9200 from the Windows 10 machine, I get:


    {
      "name" : "debiansosa",
      "cluster_name" : "elasticsearch",
      "cluster_uuid" : "0jtOoSGcQsapcSoScKlgyg",
      "version" : {
        "number" : "7.10.2",
        "build_flavor" : "default",
        "build_type" : "deb",
        "build_hash" : "747e1cc71def077253878a59143c1f785afa92b9",
        "build_date" : "2021-01-13T00:42:12.435326Z",
        "build_snapshot" : false,
        "lucene_version" : "8.7.0",
        "minimum_wire_compatibility_version" : "6.8.0",
        "minimum_index_compatibility_version" : "6.0.0-beta1"
      },
      "tagline" : "You Know, for Search"
    }

But if I try accessing Kibana, it just says "Kibana server is not ready yet."

The same happens on the Debian 10 host itself.

So this is what I get when I run `.\winlogbeat.exe setup`:


    Overwriting ILM policy is disabled. Set `setup.ilm.overwrite: true` for enabling.

    Index setup finished.
    Loading dashboards (Kibana must be running and reachable)
    Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://192.168.0.1:5601/api/status fails: parsing kibana response: invalid character 'K' looking for beginning of value. Response: Kibana server is not ready yet.
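
The "invalid character 'K'" part of that error just means Winlogbeat expected JSON from Kibana's /api/status endpoint but got the plain-text "Kibana server is not ready yet." banner instead. A quick way to see which of the two you are getting, sketched here with the not-ready response hard-coded (against a live box, replace the `resp=` line with the commented curl):

```shell
# A healthy Kibana answers /api/status with JSON; one still starting
# answers with plain text. Against a live server use:
#   resp=$(curl -s http://192.168.0.1:5601/api/status)
resp='Kibana server is not ready yet.'
if echo "$resp" | python3 -c 'import sys, json; json.load(sys.stdin)' 2>/dev/null; then
  echo "Kibana is up: got a JSON status response"
else
  echo "Kibana not ready: $resp"
fi
```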
  

Kibana can't start successfully. Please check the logs and post them here so we can find out what's going wrong.


I don't know where to find the Kibana logs, but I ran tail -n 100 /var/log/syslog and found this:

    Feb 10 20:58:30 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:30Z","tags":["error","elasticsearch","data"],"pid":7375,"message":"[ConnectionError]: connect ECONNREFUSED 127.0.0.1:9200"}
    Feb 10 20:58:32 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:32Z","tags":["error","elasticsearch","data"],"pid":7375,"message":"[ConnectionError]: connect ECONNREFUSED 127.0.0.1:9200"}
    Feb 10 20:58:35 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:35Z","tags":["error","elasticsearch","data"],"pid":7375,"message":"[ConnectionError]: connect ECONNREFUSED 127.0.0.1:9200"}
    Feb 10 20:58:36 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:36Z","tags":["info","plugins-system"],"pid":7375,"message":"Stopping all plugins."}
    Feb 10 20:58:36 debiansosa systemd[1]: Stopping Kibana...
    Feb 10 20:58:36 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:36Z","tags":["info","savedobjects-service"],"pid":7375,"message":"Starting saved objects migrations"}
    Feb 10 20:58:36 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:36Z","tags":["warning","savedobjects-service"],"pid":7375,"message":"Unable to connect to Elasticsearch. Error: Given the configuration, the ConnectionPool was not able to find a usable Connection for this request."}
    Feb 10 20:58:36 debiansosa kibana[7375]: {"type":"log","@timestamp":"2021-02-10T19:58:36Z","tags":["warning","savedobjects-service"],"pid":7375,"message":"Unable to connect to Elasticsearch. Error: Given the configuration, the ConnectionPool was not able to find a usable Connection for this request."}
    Feb 10 20:58:36 debiansosa systemd[1]: kibana.service: Succeeded.
    Feb 10 20:58:36 debiansosa systemd[1]: Stopped Kibana.
    Feb 10 20:58:36 debiansosa filebeat[5534]: 2021-02-10T20:58:36.696+0100#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(elasticsearch(http://localhost:9200)): Get "http://localhost:9200": dial tcp [::1]:9200: connect: connection refused
    Feb 10 20:58:36 debiansosa filebeat[5534]: 2021-02-10T20:58:36.696+0100#011INFO#011[publisher_pipeline_output]#011pipeline/output.go:145#011Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 93 reconnect attempt(s)
    Feb 10 20:58:36 debiansosa filebeat[5534]: 2021-02-10T20:58:36.696+0100#011INFO#011[publisher]#011pipeline/retry.go:219#011retryer: send unwait signal to consumer
    Feb 10 20:58:36 debiansosa filebeat[5534]: 2021-02-10T20:58:36.696+0100#011INFO#011[publisher]#011pipeline/retry.go:223#011  done
    Feb 10 20:58:42 debiansosa systemd[1]: Started Kibana.

If you need more info, just tell me where to find whatever you need and I'll post it here.

PS: Thank you so much for your help.

Chris.

Kibana cannot connect to Elasticsearch.

In kibana.yml, try setting the Elasticsearch host to 192.168.0.1:9200.
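
The reason: with `network.host: 192.168.0.1` in elasticsearch.yml, Elasticsearch no longer listens on 127.0.0.1, while Kibana's default `elasticsearch.hosts` is `http://localhost:9200`, which is exactly the ECONNREFUSED 127.0.0.1:9200 in your syslog. The change in kibana.yml would look like this (address adjusted to your setup):

```yaml
# kibana.yml: point Kibana at the address Elasticsearch is actually bound to
elasticsearch.hosts: ["http://192.168.0.1:9200"]
```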


Thank you so much Stephenb and Flash1293, it's working now. My kibana.yml file looks like this, just in case someone else needs it:

    # Kibana is served by a back end server. This setting specifies the port to use.
    server.port: 5601

    # To allow connections from remote users, set this parameter to a non-loopback address.
    server.host: "0.0.0.0"

    # The URLs of the Elasticsearch instances to use for all your queries.
    elasticsearch.hosts: ["http://192.168.0.1:9200"]
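
If anyone else lands here: after editing /etc/kibana/kibana.yml, remember to restart the service and give it a minute to come up. The commands below assume the Debian package install from this thread (systemd service named `kibana`):

```shell
# Restart Kibana so it picks up the new elasticsearch.hosts setting
sudo systemctl restart kibana

# Then verify: a JSON response here (instead of "Kibana server is not
# ready yet.") means Kibana connected to Elasticsearch successfully
curl -s http://192.168.0.1:5601/api/status
```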