Filebeat won't load inputs

Filebeat is supposed to ship my logs to Logstash, and Logstash forwards them to Elasticsearch, but Filebeat reports Loading Inputs: 0.
This is my filebeat.yaml:

filebeat.config.modules: 
  path: "${path.config}/modules.d/*.yml"
  reload.enabled: true
  reload.period: 10s
filebeat.inputs: 
  enabled: false
  ignore_older: 456h
  paths: 
    - /var/log/TestLog/*
  type: log
filebeat.registry.path: /var/lib/filebeat/registry/filebeat
monitoring.enabled: false
output.logstash: 
  enabled: true
  hosts: 
    - "192.168.80.20:5044"
setup.kibana: ~
setup.template.settings: 
  index.number_of_shards: 1
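
From re-reading the reference config, I suspect filebeat.inputs has to be a YAML list of input blocks, and the input has to be enabled, rather than the plain mapping with enabled: false above; something like this (not tested yet, Filebeat 7.x syntax):

filebeat.inputs:
  - type: log
    enabled: true
    ignore_older: 456h
    paths:
      - /var/log/TestLog/*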

This is my logstash.conf:

input {
  beats {
    port => 5044
    ssl => false
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["192.168.80.20:9200"]
    manage_template => false
    #index => "%{+YYYY.MM.dd}"
  }
}
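
With the index line commented out, the elasticsearch output should fall back to the plugin's default logstash-* index. If the goal was per-beat daily indices, the usual pattern from the Beats-to-Logstash docs (not tested here) is:

output {
  elasticsearch {
    hosts => ["192.168.80.20:9200"]
    manage_template => false
    # writes to e.g. filebeat-7.3.2-2019.09.26
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}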

This is the Filebeat log:

INFO        [publisher]        pipeline/module.go:97        Beat name: localhost.localdomain
WARN        beater/filebeat.go:152        Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
INFO        [monitoring]        log/log.go:118        Starting metrics logging every 30s
INFO        instance/beat.go:422        filebeat start running.
INFO        registrar/registrar.go:145        Loading registrar data from /var/lib/filebeat/registry/filebeat/filebeat/data.json
INFO        registrar/registrar.go:152        States Loaded from registrar: 63
WARN        beater/filebeat.go:368        Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
INFO        crawler/crawler.go:72        Loading Inputs: 0
INFO        crawler/crawler.go:106        Loading and starting Inputs completed. Enabled inputs: 0
INFO        cfgfile/reload.go:171        Config reloader started
INFO        log/input.go:148        Configured paths: [/var/log/elasticsearch/gc.log.[0-9]* /var/log/elasticsearch/gc.log]
INFO        log/input.go:148        Configured paths: [/var/log/elasticsearch/*.log /var/log/elasticsearch/*_server.json]
INFO        log/input.go:148        Configured paths: [/var/log/elasticsearch/*_index_search_slowlog.log /var/log/elasticsearch/*_index_indexing_slowlog.log /var/log/elasticsearch/*_index_search_slowlog.json /var/log/elasticsearch/*_index_indexing_slowlog.json]
INFO        log/input.go:148        Configured paths: [/var/log/elasticsearch/*_access.log /var/log/elasticsearch/*_audit.log /var/log/elasticsearch/*_audit.json]
INFO        log/input.go:148        Configured paths: [/var/log/elasticsearch/*_deprecation.log /var/log/elasticsearch/*_deprecation.json]
INFO        input/input.go:114        Starting input of type: log; ID: 15537576637552474368
INFO        input/input.go:114        Starting input of type: log; ID: 14070679154152675563
INFO        input/input.go:114        Starting input of type: log; ID: 7953850694515857477
INFO        input/input.go:114        Starting input of type: log; ID: 10720371839583549447
INFO        input/input.go:114        Starting input of type: log; ID: 8161597721645621668
INFO        log/harvester.go:253        Harvester started for file: /var/log/elasticsearch/gc.log
INFO        log/harvester.go:253        Harvester started for file: /var/log/elasticsearch/gc.log
INFO        log/input.go:148        Configured paths: [/var/log/logstash/logstash-plain*.log]
INFO        log/input.go:148        Configured paths: [/var/log/logstash/logstash-slowlog-plain*.log]
INFO        input/input.go:114        Starting input of type: log; ID: 17306378383715639109
INFO        input/input.go:114        Starting input of type: log; ID: 14725834876846155099
INFO        log/input.go:148        Configured paths: [/var/log/auth.log* /var/log/secure*]
INFO        log/input.go:148        Configured paths: [/var/log/messages* /var/log/syslog*]
INFO        input/input.go:114        Starting input of type: log; ID: 14797590234914819083
INFO        input/input.go:114        Starting input of type: log; ID: 16974178264304869863
INFO        log/harvester.go:253        Harvester started for file: /var/log/secure
INFO        log/harvester.go:253        Harvester started for file: /var/log/messages
INFO        pipeline/output.go:95        Connecting to backoff(async(tcp://192.168.80.20:5044))
INFO        pipeline/output.go:105        Connection to backoff(async(tcp://192.168.80.20:5044)) established
INFO        [monitoring]        log/log.go:145        Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":130,"time":{"ms":139}},"total":{"ticks":400,"time":{"ms":413},"value":400},"user":{"ticks":270,"time":{"ms":274}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":10},"info":{"ephemeral_id":"131df6f7-e19b-431c-9c51-e3bf6178db06","uptime":{"ms":30437}},"memstats":{"gc_next":8720128,"memory_alloc":6310384,"memory_total":44332168,"rss":28749824},"runtime":{"goroutines":90}},"filebeat":{"events":{"active":8,"added":407,"done":399},"harvester":{"open_files":4,"running":4,"started":4}},"libbeat":{"config":{"module":{"running":0},"reloads":2},"output":{"events":{"acked":331,"batches":18,"total":331},"read":{"bytes":108},"type":"logstash","write":{"bytes":31521}},"pipeline":{"clients":9,"events":{"active":8,"filtered":68,"published":339,"retry":175,"total":407},"queue":{"acked":331}}},"registrar":{"states":{"current":63,"update":399},"writes":{"success":86,"total":86}},"system":{"cpu":{"cores":2},"load":{"1":0.77,"15":0.73,"5":0.96,"norm":{"1":0.385,"15":0.365,"5":0.48}}}}}}
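
Note that the crawler's Loading Inputs: 0 only counts what is defined under filebeat.inputs; the Configured paths lines that follow come from the modules picked up by the config reloader (modules.d), which is why harvesters start for the elasticsearch, logstash, and system logs while nothing ever watches /var/log/TestLog. After editing the config, it can be sanity-checked with the standard 7.x CLI (adjust the -c path to wherever this filebeat.yaml lives):

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml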

This is the Logstash log:

[INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.3.2"}
[INFO ][org.reflections.Reflections] Reflections took 40 ms to scan 1 urls, producing 19 keys and 39 values
[INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.80.20:9200/]}}
[WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://192.168.80.20:9200/"}
[INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.80.20:9200"]}
[WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x5fd9f5d1 run>"}
[INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[INFO ][org.logstash.beats.Server] Starting server on port: 5044
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
