Hello! I'm testing Filebeat when Logstash is not available, but the Filebeat metrics I got from the HTTP endpoint don't match what I configured. There is no documentation describing the fields in the metrics, so I'm not sure I understood each term correctly. Let me know if I misunderstood any.
Setup:
Versions: Logstash 6.5.4, Filebeat 6.5.4.
Logstash is not running, so Filebeat's output is blocked.
- queue.mem.events = 64, but pipeline.events.active = 85 and pipeline.events.total = 171. Shouldn't the total be at most 64? Doesn't Filebeat stop reading log files when the internal memory queue is full?
- Why is filebeat.events.done = 2? The output is blocked, so there shouldn't be any events done.
- What's the difference between active and published events?
- When I query internal metrics from the HTTP endpoint, does it run a one-time collection, or are the metrics stored somewhere on the server? If they are stored in a permanent place, is that space efficient? Are the metrics deltas relative to the previous query, or cumulative since Filebeat started?
- Since Filebeat is installed on an application server, is there any way to get the internal metrics without enabling HTTP?
- When the output is blocked, will Filebeat cause high CPU usage?
 
I couldn't find much documentation related to the questions above. Any help or explanation is appreciated! Thanks!
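For reference, this is how I pull the metrics (a sketch assuming the default http.host/http.port, i.e. localhost:5066; override them in filebeat.yml if you changed them):

```shell
# Fetch Filebeat's internal metrics snapshot from the monitoring endpoint.
# 5066 is the default http.port; /stats returns the counters shown below.
curl -s http://localhost:5066/stats
```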
# enable http endpoint to monitor internal state of filebeat
http.enabled: true
# internal queue
queue:
  mem:
    events: 64
    flush.min_events: 2
    flush.timeout: 5s
filebeat.inputs:
- type: log
  exclude_files: ['\.gz$']
  close_renamed: true
  clean_removed: true
  harvester_limit: 10
  scan_frequency: 20s
  # default close_inactive is 5m; a new file handle is opened when a closed file is modified again
  # clean_removed is enabled by default
  paths:
    - /logs
  multiline.pattern: 'START:'
  multiline.negate: true
  multiline.match: after
  multiline.flush_pattern: 'END:'
processors:
  - drop_event:
      when:
        not:
           contains:
              message: "START"
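To double-check my reading of the drop_event processor above: it should keep only events whose message contains "START" and drop everything else, which I assume is what the pipeline's filtered counter reflects. A hypothetical Python analogue (not Filebeat code):

```python
# Hypothetical analogue of the drop_event processor above:
# drop any event whose message does NOT contain "START".
def keep(event):
    return "START" in event.get("message", "")

events = [
    {"message": "START: request handled"},
    {"message": "stack trace line"},  # would be dropped
]
kept = [e for e in events if keep(e)]
print(len(kept))  # → 1
```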
  "filebeat": {
    "events": {
      "active": 169,
      "added": 171,
      "done": 2
    },
    "harvester": {
      "closed": 0,
      "open_files": 1,
      "running": 1,
      "skipped": 0,
      "started": 1
    },
    "input": {
      "log": {
        "files": {
          "renamed": 0,
          "truncated": 0
        }
      }
    }
  },
  "libbeat": {
    "config": {
      "module": {
        "running": 0,
        "starts": 0,
        "stops": 0
      },
      "reloads": 0
    },
    "output": {
      "events": {
        "acked": 0,
        "active": 0,
        "batches": 0,
        "dropped": 0,
        "duplicates": 0,
        "failed": 0,
        "total": 0
      },
      "read": {
        "bytes": 0,
        "errors": 0
      },
      "type": "logstash",
      "write": {
        "bytes": 0,
        "errors": 0
      }
    },
    "pipeline": {
      "clients": 1,
      "events": {
        "active": 85,
        "dropped": 0,
        "failed": 0,
        "filtered": 86,
        "published": 84,
        "retry": 2,
        "total": 171
      },
      "queue": {
        "acked": 0
      }
    }
  },
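The way I'm currently reading this snapshot (my assumption, so please correct me): filebeat.events.active = added - done, and libbeat.pipeline.events.active = total - filtered - acked. The numbers above are at least consistent with that:

```python
# Counters copied from the snapshot above.
fb = {"added": 171, "done": 2, "active": 169}            # filebeat.events
pl = {"total": 171, "filtered": 86, "acked": 0, "active": 85}  # libbeat.pipeline.events

# Assumed identities (my guess, not from official docs):
assert fb["added"] - fb["done"] == fb["active"]                    # 171 - 2 == 169
assert pl["total"] - pl["filtered"] - pl["acked"] == pl["active"]  # 171 - 86 - 0 == 85
print("identities hold for this snapshot")
```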