Monitoring.ui.logs.index not working on docker?

Hi to all,
I've set up a working Kibana using docker-compose, and I use journalbeat to collect the Elasticsearch logs of my docker container (which uses journald as its logging driver). I've tried changing monitoring.ui.logs.index to journalbeat-*, but still no log is shown in the Stack Monitoring section.
I've used MONITORING_UI_LOGS_INDEX: "journalbeat-*" in the environment section of the service to set that property. Could it be that the docker helper does not pass it correctly to the container?
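For reference, this is roughly the compose fragment I mean (a minimal sketch; the service name and image tag are placeholders, and I'm assuming the image's entrypoint translates the upper-case variable into the dotted kibana.yml setting):

```yaml
# Hypothetical docker-compose fragment; version tag is a placeholder.
# The Kibana docker entrypoint is expected to map MONITORING_UI_LOGS_INDEX
# to the monitoring.ui.logs.index setting in kibana.yml.
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:7.6.2
    environment:
      MONITORING_UI_LOGS_INDEX: "journalbeat-*"
```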

The logs produced by filebeat and journalbeat are almost the same (I set up some processors in journalbeat to populate the elasticsearch.* fields correctly, as well as fileset.name and input.type), so it is not at all clear to me why Kibana does not show any Elasticsearch logs.

Any help is appreciated and thanks in advance,
Flavio

@Flavio_Pompermaier Sorry to hear that.

This is the query we use to fetch logs:

GET journalbeat-*/_search
{
  "size": 0,
  "sort": {
    "@timestamp": {
      "order": "desc",
      "unmapped_type": "long"
    }
  },
  "query": {
    "bool": {
      "filter": [
        {
          "term": {
            "service.type": "elasticsearch"
          }
        }
      ]
    }
  },
  "aggs": {
    "types": {
      "terms": {
        "field": "event.dataset"
      },
      "aggs": {
        "levels": {
          "terms": {
            "field": "log.level"
          }
        }
      }
    }
  }
}

Can you please run this in Dev Tools and post the result here? Also, can you please send me one raw document from journalbeat-*? I want to make sure it abides by the "service.type": "elasticsearch" filter in the query.
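A single raw document can be pulled in Dev Tools with, for example, a size-1 search sorted by timestamp (same index pattern and sort as the query above):

```
GET journalbeat-*/_search
{
  "size": 1,
  "sort": {
    "@timestamp": {
      "order": "desc",
      "unmapped_type": "long"
    }
  }
}
```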

Unfortunately I've reverted to file logging for my ES container to send its logs to the cluster, because I saw that there is a lot of logic inside the filebeat elasticsearch-xpack module and I want the logs to be handled correctly. In an ideal world there would be a journalbeat elasticsearch-xpack module that replicates all the logic contained in the filebeat one. Before switching back to file logging, this was the processors section of my journalbeat config (I think the query you suggested was returning a non-empty response when I had data in the journalbeat index):

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - if:
      regexp:
        journald.custom.image_name: "^docker.elastic.co/elasticsearch/elasticsearch.*"
    then:
      - decode_json_fields:
          fields:
            - message
          target: elasticsearch
          add_error_key: true
          overwrite_keys: true
      - drop_fields:
          fields:
            - message
          ignore_missing: true
      - rename:
          fields:
          - from: elasticsearch.message
            to: message
          - from: elasticsearch.type
            to: fileset.name
          - from: elasticsearch.level
            to: log.level
          - from: elasticsearch.service.type
            to: service.type
          fail_on_error: false
          ignore_missing: true
      - add_fields:
          target: event
          fields:
            module: elasticsearch
            service.type: elasticsearch
      - add_fields:
          target: input
          fields:
            type: "log"
      - script:
          lang: javascript
          id: add_event_dataset
          source: >
            function process(event) {
              var value = event.Get("fileset.name");
              event.Put("event.dataset", "elasticsearch." + value);
            }
      - if:
          not:
            has_fields:
              - service.type
        then:
          - add_fields:
              target: service
              fields:
                type: elasticsearch

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.