Multi-File configuration

My filebeat.yml configuration is as follows:

output.logstash:
  hosts: ["localhost:5044"]

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

filebeat.prospectors:

- input_type: log
  enabled: true
  paths:
    - data\Logs_20180720\AB.log
  fields: {log_type: ab}


- input_type: log
  enabled: true
  paths:
    - data\Logs_20180720\BC.log
  fields: {log_type: bc}
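
A note in passing: the Logstash filter below tags lines beginning with `\tat` (Java stack-trace frames), but without a multiline setting Filebeat ships each such line as a separate event. Here is a sketch of how one of the inputs could join stack-trace lines onto the preceding log line, assuming each log entry starts with an ISO timestamp (the `multiline.*` keys are standard Filebeat options; the input shown is just the first one from above):

```yaml
- input_type: log
  enabled: true
  paths:
    - data\Logs_20180720\AB.log
  fields: {log_type: ab}
  # Lines NOT starting with a date are appended to the previous event,
  # so a stack trace travels with the log line that produced it.
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after
```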

My complete Logstash configuration is as follows:

input {
  beats {
    port => 5044
  }
}

filter {

  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
  
  if ([fields][log_type] == "ab") {
    mutate {
      replace => {
        "[type]" => "ab"
      }
    }
  }
  else if ([fields][log_type] == "bc") {
    mutate {
      replace => {
        "[type]" => "bc"
      }
    }
  }
  
  # Grok Spring Boot's default log format
  grok {
    match => [
      "message", '^%{TIMESTAMP_ISO8601:timestamp} (\[%{DATA:thread}\] )?%{LOGLEVEL:level}%{SPACE}%{JAVACLASS:class}\.%{DATA:method} - %{GREEDYDATA:logMessage}$'
    ]
  }

}
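
For reference, this is roughly how that grok pattern would carve up a hypothetical Spring Boot line (the sample line and the field values are illustrative, not verified output):

```
2018-08-03 15:11:15.183 [main] INFO com.example.demo.App.start - Application started

timestamp  => "2018-08-03 15:11:15.183"
thread     => "main"
level      => "INFO"
class      => "com.example.demo.App"
method     => "start"
logMessage => "Application started"
```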

output {

  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug { metadata => true } }
  }

  if "log" in [tags] {
    if "ERROR" in [level] {
      elasticsearch { hosts => ["localhost:9200"] }
    }
    else if "WARN" in [level] {
      elasticsearch { hosts => ["localhost:9200"] }
    }
    else if "INFO" in [level] {
      elasticsearch { hosts => ["localhost:9200"] }
    }
    else if "FATAL" in [level] {
      elasticsearch { hosts => ["localhost:9200"] }
    }
  }

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "raghu-%{type}-%{+YYYY.MM}"
  }
}
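
As an aside, the four level branches in that output all point at the same Elasticsearch instance, so they could be collapsed into a single conditional — a sketch:

```
output {
  # One branch instead of four identical ones
  if "log" in [tags] and [level] in ["ERROR", "WARN", "INFO", "FATAL"] {
    elasticsearch { hosts => ["localhost:9200"] }
  }
}
```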

There is no error on the console, but nothing is happening: the logs are being generated, yet Filebeat stays silent. Kindly suggest.

Below is the Filebeat console log:

2018-08-03T15:11:15.183+0530    INFO    instance/beat.go:492    Home path: [C:\tools\filebeat] Config path: [C:\tools\filebeat] Data path: [C:\tools\filebeat\data] Logs path: [C:\tools\filebeat\logs]
2018-08-03T15:11:15.184+0530    INFO    instance/beat.go:499    Beat UUID: 2e8df794-8fd7-4fd1-a380-67d2ede115c7
2018-08-03T15:11:15.185+0530    INFO    [beat]  instance/beat.go:716    Beat info       {"system_info": {"beat": {"path": {"config": "C:\\tools\\filebeat", "data": "C:\\tools\\filebeat\\data", "home": "C:\\tools\\filebeat", "logs": "C:\\tools\\filebeat\\logs"}, "type": "filebeat", "uuid": "2e8df794-8fd7-4fd1-a380-67d2ede115c7"}}}
2018-08-03T15:11:15.185+0530    INFO    [beat]  instance/beat.go:725    Build info      {"system_info": {"build": {"commit": "ed42bb85e72ae58cc09748dc1825159713e0ffd4", "libbeat": "6.3.1", "time": "2018-06-29T21:09:04.000Z", "version": "6.3.1"}}}
2018-08-03T15:11:15.185+0530    INFO    [beat]  instance/beat.go:728    Go runtime info {"system_info": {"go": {"os":"windows","arch":"amd64","max_procs":4,"version":"go1.9.4"}}}
2018-08-03T15:11:15.218+0530    INFO    instance/beat.go:225    Setup Beat: filebeat; Version: 6.3.1
2018-08-03T15:11:15.219+0530    INFO    pipeline/module.go:81   Beat name: RILITS-HWLTP132
2018-08-03T15:11:15.219+0530    WARN    [cfgwarn]       beater/filebeat.go:61   DEPRECATED: prospectors are deprecated, Use `inputs` instead. Will be removed in version: 7.0.0
2018-08-03T15:11:15.223+0530    INFO    instance/beat.go:315    filebeat start running.
2018-08-03T15:11:15.224+0530    INFO    [monitoring]    log/log.go:97   Starting metrics logging every 30s
2018-08-03T15:11:15.225+0530    INFO    registrar/registrar.go:116      Loading registrar data from C:\tools\filebeat\data\registry
2018-08-03T15:11:15.226+0530    INFO    registrar/registrar.go:127      States Loaded from registrar: 3
2018-08-03T15:11:15.227+0530    WARN    beater/filebeat.go:354  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018-08-03T15:11:15.227+0530    INFO    crawler/crawler.go:48   Loading Inputs: 2
2018-08-03T15:11:15.227+0530    WARN    [cfgwarn]       input/config.go:25      DEPRECATED: input_type input config is deprecated. Use type instead. Will be removed in version: 6.0.0
2018-08-03T15:11:15.229+0530    INFO    log/input.go:113        Configured paths: [C:\tools\filebeat\data\Logs_20180720\eis_log_file.log]
2018-08-03T15:11:15.231+0530    INFO    input/input.go:88       Starting input of type: log; ID: 10642447689570774022
2018-08-03T15:11:15.231+0530    WARN    [cfgwarn]       input/config.go:25      DEPRECATED: input_type input config is deprecated. Use type instead. Will be removed in version: 6.0.0
2018-08-03T15:11:15.232+0530    INFO    log/input.go:113        Configured paths: [C:\tools\filebeat\data\Logs_20180720\SA.log]
2018-08-03T15:11:15.232+0530    INFO    input/input.go:88       Starting input of type: log; ID: 15709701256631990576
2018-08-03T15:11:15.233+0530    INFO    crawler/crawler.go:82   Loading and starting Inputs completed. Enabled inputs: 2
2018-08-03T15:11:15.233+0530    INFO    cfgfile/reload.go:122   Config reloader started
2018-08-03T15:11:15.234+0530    INFO    cfgfile/reload.go:214   Loading of config files completed.

Could you please format your config using </>? Also, please attach debug logs.

Could you also add the output part of your config? Also, please attach the output of ./filebeat -e -d "*".

Added, please check.

So it looks like you have already sent events from these inputs. Filebeat does not reread messages it has encountered previously. If you delete your data/registry file, all files will be read again and all events will be forwarded once more.
Please don't delete the file if it's important that the output contain no duplicated logs.

I deleted the registry file, and I see that only one file is getting logged. Since I wanted to test, I pointed the output config at a file. How can I use two output files in my case, so that AB.log goes to filebeat-ab.log and the other to filebeat-bc.log? Thanks.

You can only write events to a single file; there is no way to separate the input files. So you could try testing it once with the first input file and then with the second one.
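
That said, if the file output is configured on the Logstash side rather than in Filebeat, events can be routed to separate files with a conditional on the `fields.log_type` value set in filebeat.yml — a sketch using Logstash's `file` output plugin (the file names match the ones asked about above; adjust the paths to your setup):

```
output {
  # Route each input's events to its own file, keyed on the custom field
  # added under `fields:` in filebeat.yml.
  if [fields][log_type] == "ab" {
    file { path => "filebeat-ab.log" }
  } else if [fields][log_type] == "bc" {
    file { path => "filebeat-bc.log" }
  }
}
```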