Issue reading from multiple config inputs

We are trying to use multiple `filebeat.inputs` of type `log` so we can write different topics to our Kafka hosts. Below are the configs and the logs.
```yaml
# Main config (filebeat.yml)
output.kafka:
  hosts: ["sitkafka342w88m7:9092","sitkafka343w88m7:9092","sitkafka373w86m7:9092","sitkafka374w86m7:9092"]
  topic: '%{[pipeline.topic]}'
  key: '%{[beat.hostname]}-%{[source]}'
  version: '0.11'
  partition.round_robin:
    reachable_only: false
  compression: gzip

filebeat.config.inputs:
  enabled: true
  path: 'C:\Program Files\Filebeat\inputs*.yml'
  reload.enabled: true
  reload.period: 10s

logging.level: info
logging.to_files: true
logging.files:
  path: 'C:\Program Files\Filebeat\logs'
  name: filebeat
  keepfiles: 5
```

The two other input configs:

```yaml
# Input config 1
output.kafka:
  hosts: ["sitkafka342w88m7:9092","sitkafka343w88m7:9092","sitkafka373w86m7:9092","sitkafka374w86m7:9092"]
  topic: '%{[pipeline.topic]}'
  key: '%{[beat.hostname]}-%{[source]}'
  version: '0.11'
  partition.round_robin:
    reachable_only: false
  compression: gzip

filebeat.inputs:
- type: log
  paths:
    - D:\DIT\log\Services*
    - D:\DIT\log\batch.log
    - D:\DIT\log\csservice.log
    - D:\DIT\log\MUCS.LOG
    - D:\DIT\log\SCD.LOG
    - D:\DIT\log\service.log
    - D:\DIT\log\ServiceAgent.log
  scan_frequency: 10s
  fields_under_root: true
  fields:
    host: "server1"
    pipeline:
      topic: "dit-simcorp-logs-raw"
    source: "filebeat"
```

```yaml
# Input config 2
output.kafka:
  hosts: ["sitkafka342w88m7:9092","sitkafka343w88m7:9092","sitkafka373w86m7:9092","sitkafka374w86m7:9092"]
  topic: '%{[pipeline.topic]}'
  key: '%{[beat.hostname]}-%{[source]}'
  version: '0.11'
  partition.round_robin:
    reachable_only: false
  compression: gzip

filebeat.inputs:
- type: log
  paths:
    - D:\DIT\log\Batch*
    - D:\DIT\log\Batch*
  scan_frequency: 10s
  fields_under_root: true
  fields:
    host: "server2"
    pipeline:
      topic: "dit-simcorp-batch-logs-raw"
    source: "filebeat"
```
The logs:

```
PS C:\Program Files\Filebeat> .\filebeat.exe -c .\filebeat.yml -e -v -d "*"
2019-10-29T12:43:32.859-0400 INFO instance/beat.go:607 Home path: [C:\Program Files\Filebeat] Config path: [C:\Program Files\Filebeat] Data path: [C:\Program Files\Filebeat\data] Logs path: [C:\Program Files\Filebeat\logs]
2019-10-29T12:43:32.861-0400 DEBUG [beat] instance/beat.go:659 Beat metadata path: C:\Program Files\Filebeat\data\meta.json
2019-10-29T12:43:32.862-0400 INFO instance/beat.go:615 Beat ID: 42135d8b-3615-4694-9b0c-91227d7b3e1f
2019-10-29T12:43:32.863-0400 DEBUG [seccomp] seccomp/seccomp.go:96 Syscall filtering is only supported on Linux
2019-10-29T12:43:32.865-0400 INFO [beat] instance/beat.go:903 Beat info {"system_info": {"beat": {"path": {"config": "C:\Program Files\Filebeat", "data": "C:\Program Files\Filebeat\data", "home": "C:\Program Files\Filebeat", "logs": "C:\Program Files\Filebeat\logs"}, "type": "filebeat", "uuid": "42135d8b-3615-4694-9b0c-91227d7b3e1f"}}}
2019-10-29T12:43:32.866-0400 INFO [beat] instance/beat.go:912 Build info {"system_info": {"build": {"commit": "f940c36884d3749901a9c99bea5463a6030cdd9c", "libbeat": "7.4.0", "time": "2019-09-27T07:45:42.000Z", "version": "7.4.0"}}}
2019-10-29T12:43:32.866-0400 INFO [beat] instance/beat.go:915 Go runtime info {"system_info": {"go": {"os":"windows","arch":"amd64","max_procs":4,"version":"go1.12.9"}}}
2019-10-29T12:43:32.873-0400 INFO [beat] instance/beat.go:919 Host info {"system_info": {"host": {"architecture":"x86_64","boot_time":"2019-10-28T17:11:00.53-04:00","name":"iaddit1SCfs01","ip":["fe80::80ea:339:d797:23cf/64","10.152.97.246/23","::1/128","127.0.0.1/8","fe80::5efe:a98:61f6/128"],"kernel_version":"6.3.9600.19464 (winblue_ltsb_escrow.190828-1437)","mac":["00:50:56:9f:90:2f","00:00:00:00:00:00:00:e0"],"os":{"family":"windows","platform":"windows","name":"Windows Server 2012 R2 Standard","version":"6.3","major":3,"minor":0,"patch":0,"build":"9600.19463"},"timezone":"EDT","timezone_offset_sec":-14400,"id":"da6edb61-d146-4b23-874c-132c38475620"}}}
2019-10-29T12:43:32.875-0400 INFO [beat] instance/beat.go:948 Process info {"system_info": {"process": {"cwd": "C:\Program Files\Filebeat", "exe": "C:\Program Files\Filebeat\filebeat.exe", "name": "filebeat.exe", "pid": 1860, "ppid": 4168, "start_time": "2019-10-29T12:43:32.348-0400"}}}
2019-10-29T12:43:32.876-0400 INFO instance/beat.go:292 Setup Beat: filebeat; Version: 7.4.0
2019-10-29T12:43:32.876-0400 DEBUG [beat] instance/beat.go:318 Initializing output plugins
2019-10-29T12:43:32.876-0400 DEBUG [kafka] kafka/kafka.go:61 initialize kafka output
2019-10-29T12:43:32.880-0400 DEBUG [publisher] pipeline/consumer.go:137 start pipeline event consumer
2019-10-29T12:43:32.882-0400 INFO [publisher] pipeline/module.go:97 Beat name: iaddit1SCfs01
2019-10-29T12:43:32.883-0400 WARN beater/filebeat.go:152 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2019-10-29T12:43:32.886-0400 INFO instance/beat.go:422 filebeat start running.
2019-10-29T12:43:32.886-0400 DEBUG [service] service/service_windows.go:72 Windows is interactive: true
2019-10-29T12:43:32.886-0400 INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
```

We have no issues when running with only one of the configs.

You can have only one `output.*` in your configuration, even if every file configures the same output. Otherwise the options you set are overwritten.
What is it that you are trying to achieve? All the Kafka settings look the same to me in this format. Could you please format your configuration using </>?
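To illustrate the single-output layout: `output.kafka` lives only in the main `filebeat.yml`, the external files contain nothing but input definitions, and each input picks its topic through a custom field. This is a sketch based on the configs above; the file name `inputs-batch.yml` is illustrative, and the host list is shortened.

```yaml
# filebeat.yml — the only place an output is defined
output.kafka:
  hosts: ["sitkafka342w88m7:9092","sitkafka343w88m7:9092"]
  topic: '%{[pipeline.topic]}'

filebeat.config.inputs:
  enabled: true
  path: 'C:\Program Files\Filebeat\inputs*.yml'

# inputs-batch.yml — a bare list of inputs, no output.* or filebeat.* keys
- type: log
  paths:
    - D:\DIT\log\Batch*
  fields_under_root: true
  fields:
    pipeline:
      topic: "dit-simcorp-batch-logs-raw"
```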

We are pointing some logs to one topic and other logs to another topic, so there will be two different dashboards for the business to consume. We figured out the issue: we needed to update the other input files so they contain only the input definitions, without the `output.kafka` and `filebeat.config.inputs` sections. They now look like this:

```yaml
- type: log
  paths:
    - D:\DIT\log\Batch**.log
    - D:\DIT\log\Batch**.txt
  scan_frequency: 30s
  fields_under_root: true
  fields:
    host: "IADDITSCFS01"
    pipeline:
      topic: "dit-simcorp-batch-logs-raw"
    source: "filebeat"
```
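An alternative worth noting (not used here, but supported by the Kafka output) is to select the topic with `topics` rules in `output.kafka` instead of per-input fields. A sketch, assuming a shortened host list and matching on the file path of the event:

```yaml
output.kafka:
  hosts: ["sitkafka342w88m7:9092","sitkafka343w88m7:9092"]
  topic: "dit-simcorp-logs-raw"            # default topic
  topics:
    - topic: "dit-simcorp-batch-logs-raw"  # batch logs go to their own topic
      when.contains:
        log.file.path: "Batch"
```

With this approach the routing logic stays in one place and the input files stay free of topic names.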

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.