Filebeat starts normally but no logs are shipped to Elasticsearch

I installed Filebeat on macOS (localhost); the ELK stack runs in Docker Desktop for Mac. When I start Filebeat, the log output looks normal and doesn't seem to show any problems, but nothing is transmitted to Elasticsearch.
filebeat.yml

# ============================== Filebeat inputs ===============================

filebeat.inputs:
- type: log

  enabled: true


  paths:
    - /var/log/*.log
    - /var/lib/docker/containers/*/*.log
    - /Users/wanghaoyang/Desktop/temp/logs/*.log
    #- c:\programdata\elasticsearch\logs\*

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
  # Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash
  #multiline.match: after

# filestream is an experimental input. It is going to replace log input in the future.
- type: filestream

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #prospector.scanner.exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

# ================================= Dashboards =================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

# =============================== Elastic Cloud ================================

# ================================== Outputs ===================================


# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  # username: "root"
  # password: ""
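Before combing through the startup logs, Filebeat's built-in `test` subcommands are a quick way to confirm that the config parses and that the Elasticsearch output is reachable. A sketch, assuming it is run from the extracted Filebeat directory shown in the logs below:

```shell
# Run from the Filebeat install directory.
./filebeat test config -c filebeat.yml   # does the YAML parse?
./filebeat test output -c filebeat.yml   # can it reach Elasticsearch?
```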

Startup log:

2021-04-20T22:52:54.459+0800	INFO	instance/beat.go:660	Home path: [/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64] Config path: [/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64] Data path: [/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/data] Logs path: [/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/logs]
2021-04-20T22:52:54.462+0800	INFO	instance/beat.go:668	Beat ID: 8f3cfbac-4ef2-4e9e-879b-c386c20bb429
2021-04-20T22:52:54.576+0800	INFO	[beat]	instance/beat.go:996	Beat info	{"system_info": {"beat": {"path": {"config": "/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64", "data": "/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/data", "home": "/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64", "logs": "/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/logs"}, "type": "filebeat", "uuid": "8f3cfbac-4ef2-4e9e-879b-c386c20bb429"}}}
2021-04-20T22:52:54.577+0800	INFO	[beat]	instance/beat.go:1005	Build info	{"system_info": {"build": {"commit": "08e20483a651ea5ad60115f68ff0e53e6360573a", "libbeat": "7.12.0", "time": "2021-03-18T06:16:51.000Z", "version": "7.12.0"}}}
2021-04-20T22:52:54.577+0800	INFO	[beat]	instance/beat.go:1008	Go runtime info	{"system_info": {"go": {"os":"darwin","arch":"amd64","max_procs":12,"version":"go1.15.8"}}}
2021-04-20T22:52:54.577+0800	INFO	[beat]	instance/beat.go:1012	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2021-04-20T09:35:41.276798+08:00","name":"wanghaoangdeMBP","ip":["127.0.0.1/8","::1/128","fe80::1/64","fe80::aede:48ff:fe00:1122/64","fe80::1454:900b:995f:c02e/64","192.168.31.53/24","fe80::b461:9ff:fe36:b682/64","fe80::b461:9ff:fe36:b682/64","fe80::b693:ace4:cea0:a8d9/64","fe80::42c7:46e9:4769:4c60/64"],"kernel_version":"20.3.0","mac":["ac:de:48:00:11:22","3e:22:fb:86:59:c0","3c:22:fb:86:59:c0","b6:61:09:36:b6:82","b6:61:09:36:b6:82","82:02:e8:88:c4:05","82:02:e8:88:c4:04","82:02:e8:88:c4:01","82:02:e8:88:c4:00","82:02:e8:88:c4:01"],"os":{"type":"macos","family":"darwin","platform":"darwin","name":"Mac OS X","version":"10.16","major":10,"minor":16,"patch":0,"build":"20D91"},"timezone":"CST","timezone_offset_sec":28800,"id":"C1E7553C-EE47-5BCD-AE7B-C7A9DA962EDE"}}}
2021-04-20T22:52:54.578+0800	INFO	[beat]	instance/beat.go:1041	Process info	{"system_info": {"process": {"cwd": "/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64", "exe": "./filebeat", "name": "filebeat", "pid": 16744, "ppid": 16743, "start_time": "2021-04-20T22:52:53.879+0800"}}}
2021-04-20T22:52:54.578+0800	INFO	instance/beat.go:304	Setup Beat: filebeat; Version: 7.12.0
2021-04-20T22:52:54.578+0800	INFO	[index-management]	idxmgmt/std.go:184	Set output.elasticsearch.index to 'filebeat-7.12.0' as ILM is enabled.
2021-04-20T22:52:54.585+0800	INFO	eslegclient/connection.go:99	elasticsearch url: http://localhost:9200
2021-04-20T22:52:54.591+0800	INFO	[publisher]	pipeline/module.go:113	Beat name: wanghaoangdeMBP
2021-04-20T22:52:54.596+0800	INFO	instance/beat.go:468	filebeat start running.
2021-04-20T22:52:54.596+0800	INFO	[monitoring]	log/log.go:117	Starting metrics logging every 30s
2021-04-20T22:52:54.600+0800	INFO	memlog/store.go:119	Loading data file of '/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/data/registry/filebeat' succeeded. Active transaction id=4015765
2021-04-20T22:52:54.661+0800	INFO	memlog/store.go:124	Finished loading transaction log file for '/Users/wanghaoyang/Downloads/filebeat-7.12.0-darwin-x86_64/data/registry/filebeat'. Active transaction id=4019704
2021-04-20T22:52:54.665+0800	INFO	[registrar]	registrar/registrar.go:109	States Loaded from registrar: 146
2021-04-20T22:52:54.665+0800	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 2
2021-04-20T22:52:55.081+0800	INFO	log/input.go:157	Configured paths: [/var/log/*.log /var/lib/docker/containers/*/*.log /Users/wanghaoyang/Desktop/temp/logs/*.log]
2021-04-20T22:52:55.081+0800	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 1666227476879348107)
2021-04-20T22:52:55.083+0800	INFO	[crawler]	beater/crawler.go:108	Loading and starting Inputs completed. Enabled inputs: 1
2021-04-20T22:52:55.083+0800	INFO	cfgfile/reload.go:164	Config reloader started
2021-04-20T22:52:55.083+0800	INFO	cfgfile/reload.go:224	Loading of config files completed.
2021-04-20T22:52:55.090+0800	INFO	log/harvester.go:302	Harvester started for file: /var/log/system.log
2021-04-20T22:52:57.473+0800	INFO	[add_cloud_metadata]	add_cloud_metadata/add_cloud_metadata.go:101	add_cloud_metadata: hosting provider type not detected.
2021-04-20T22:52:58.480+0800	INFO	[publisher_pipeline_output]	pipeline/output.go:143	Connecting to backoff(elasticsearch(http://localhost:9200))
2021-04-20T22:52:58.478+0800	INFO	[publisher]	pipeline/retry.go:219	retryer: send unwait signal to consumer
2021-04-20T22:52:58.481+0800	INFO	[publisher]	pipeline/retry.go:223	  done
2021-04-20T22:52:58.490+0800	INFO	[esclientleg]	eslegclient/connection.go:314	Attempting to connect to Elasticsearch version 7.12.0
2021-04-20T22:52:58.511+0800	INFO	[license]	licenser/es_callback.go:51	Elasticsearch license: Basic
2021-04-20T22:52:58.519+0800	INFO	[esclientleg]	eslegclient/connection.go:314	Attempting to connect to Elasticsearch version 7.12.0
2021-04-20T22:52:58.543+0800	INFO	[index-management]	idxmgmt/std.go:261	Auto ILM enable success.
2021-04-20T22:52:58.549+0800	INFO	[index-management.ilm]	ilm/std.go:139	do not generate ilm policy: exists=true, overwrite=false
2021-04-20T22:52:58.550+0800	INFO	[index-management]	idxmgmt/std.go:274	ILM policy successfully loaded.
2021-04-20T22:52:58.550+0800	INFO	[index-management]	idxmgmt/std.go:407	Set setup.template.name to '{filebeat-7.12.0 {now/d}-000001}' as ILM is enabled.
2021-04-20T22:52:58.550+0800	INFO	[index-management]	idxmgmt/std.go:412	Set setup.template.pattern to 'filebeat-7.12.0-*' as ILM is enabled.
2021-04-20T22:52:58.550+0800	INFO	[index-management]	idxmgmt/std.go:446	Set settings.index.lifecycle.rollover_alias in template to {filebeat-7.12.0 {now/d}-000001} as ILM is enabled.
2021-04-20T22:52:58.550+0800	INFO	[index-management]	idxmgmt/std.go:450	Set settings.index.lifecycle.name in template to {filebeat {"policy":{"phases":{"hot":{"actions":{"rollover":{"max_age":"30d","max_size":"50gb"}}}}}}} as ILM is enabled.
2021-04-20T22:52:58.559+0800	INFO	template/load.go:97	Template filebeat-7.12.0 already exists and will not be overwritten.
2021-04-20T22:52:58.559+0800	INFO	[index-management]	idxmgmt/std.go:298	Loaded index template.
2021-04-20T22:52:58.566+0800	INFO	[index-management]	idxmgmt/std.go:309	Write alias successfully generated.
2021-04-20T22:52:58.566+0800	INFO	[publisher_pipeline_output]	pipeline/output.go:151	Connection to backoff(elasticsearch(http://localhost:9200)) established
2021-04-20T22:53:24.608+0800	INFO	[monitoring]	log/log.go:144	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":362,"time":{"ms":362}},"total":{"ticks":907,"time":{"ms":908},"value":907},"user":{"ticks":545,"time":{"ms":546}}},"info":{"ephemeral_id":"e0247709-782f-420e-ac54-c09ffae6c07c","uptime":{"ms":30491}},"memstats":{"gc_next":20877200,"memory_alloc":13480760,"memory_sys":77022208,"memory_total":129720200,"rss":48410624},"runtime":{"goroutines":36}},"filebeat":{"events":{"added":233,"done":233},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1,"scans":1},"output":{"events":{"acked":86,"active":0,"batches":3,"total":86},"read":{"bytes":4549},"type":"elasticsearch","write":{"bytes":107620}},"pipeline":{"clients":1,"events":{"active":0,"filtered":147,"published":86,"retry":50,"total":233},"queue":{"acked":86}}},"registrar":{"states":{"current":146,"update":233},"writes":{"success":150,"total":150}},"system":{"cpu":{"cores":12},"load":{"1":4.2305,"15":4.7642,"5":4.5625,"norm":{"1":0.3525,"15":0.397,"5":0.3802}}}}}}

I added a log file after Filebeat started, and its output looked like this; it didn't look like there was a problem.

2021-04-20T22:55:15.205+0800	INFO	log/harvester.go:302	Harvester started for file: /Users/wanghaoyang/Desktop/temp/logs/sys-user的副本2.log

Please help me.

How do you know it is not transmitted? ... The Filebeat output logs actually show events being sent.

Can you run the following command in the Kibana / Dev Tools and show the results?

GET /_cat/indices/?v
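If Kibana isn't handy, the same check can be run from a terminal with curl (assuming Elasticsearch is on localhost:9200 with security disabled, as in the config above):

```shell
# List all indices; a working Filebeat setup should show a
# filebeat-7.12.0-* index here.
curl -s 'http://localhost:9200/_cat/indices?v'
```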

From your logs...

Harvester started for file: /var/log/system.log

Filebeat only started one harvester. The other paths you provided either matched no files, don't exist, or the user you started Filebeat as does not have read access to them.

Are any new logs being written to your system.log? Filebeat will only read them once.

Most likely those Docker logs need root privileges.
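One way to rule out permissions is to check, as the same user that runs Filebeat, whether the globbed files are readable. A minimal sketch using the globs from the config in the question (unreadable files silently get no harvester):

```shell
#!/bin/sh
# Report which files matching the given globs are readable
# by the current user.
check_readable() {
  for f in "$@"; do
    [ -e "$f" ] || continue          # glob matched nothing
    if [ -r "$f" ]; then
      echo "readable:     $f"
    else
      echo "NOT readable: $f"
    fi
  done
}

# Globs taken from the filebeat.yml in the question.
check_readable /var/log/*.log /var/lib/docker/containers/*/*.log
```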

I'm not sure why it can't find the desktop logs.

But the Filebeat output shows it's only finding one log file.

Sorry, I made a stupid mistake. After executing
GET /_cat/indices/?v
I found that no index had been created for Filebeat.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.