and many files under the config_dir that act as proper prospector configs, following the documented rule:
Full Path to directory with additional prospector configuration files. Each file must end with .yml
These config files must have the full filebeat config part inside, but only
the prospector part is processed. All global options like spool_size are ignored.
The config_dir MUST point to a different directory than the one the main filebeat config file is in.
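Under those rules, a layout like the following should be valid (a sketch with illustrative paths, using filebeat 1.x syntax; both files are shown in one listing):

```yaml
# /etc/filebeat/filebeat.yml -- main config
filebeat:
  prospectors:
    -
      paths:
        - /var/log/app/*.log
      input_type: log
  # must point somewhere OTHER than the directory of this file
  config_dir: /etc/filebeat/conf.d

# /etc/filebeat/conf.d/extra.yml -- additional config; only the
# prospector part is processed, global options here are ignored
filebeat:
  prospectors:
    -
      paths:
        - /var/log/other/*.log
      input_type: log
```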
I assumed a config like that would be possible. Am I wrong in this assumption, or is there something bad at the syntax level? (I guess not, because of the resulting error string.)
I would expect the above to work, but I have to test it locally to confirm. Can you run filebeat with debug output enabled (-e -d "*") and paste the output of the setup phase here? That should give us some more information on which files are loaded (or not).
PS: I edited your post to properly show the config.
The following is all the output I get, nothing more:
./filebeat -e -d "*"
2016/02/17 14:57:47.367709 beat.go:135: DBG Initializing output plugins
2016/02/17 14:57:47.367752 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/02/17 14:57:47.367997 logstash.go:106: INFO Max Retries set to: 3
2016/02/17 14:57:47.368019 client.go:90: DBG connect
2016/02/17 14:57:47.368478 outputs.go:119: INFO Activated logstash as output plugin.
2016/02/17 14:57:47.368508 outputs.go:119: INFO Activated console as output plugin.
2016/02/17 14:57:47.368520 publish.go:232: DBG Create output worker
2016/02/17 14:57:47.368553 publish.go:232: DBG Create output worker
2016/02/17 14:57:47.368593 publish.go:274: DBG No output is defined to store the topology. The server fields might not be filled.
2016/02/17 14:57:47.368638 publish.go:288: INFO Publisher name: febper2
2016/02/17 14:57:47.368840 async.go:78: INFO Flush Interval set to: 1s
2016/02/17 14:57:47.368858 async.go:84: INFO Max Bulk Size set to: 2048
2016/02/17 14:57:47.368870 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
2016/02/17 14:57:47.368937 async.go:78: INFO Flush Interval set to: 1s
2016/02/17 14:57:47.368957 async.go:84: INFO Max Bulk Size set to: 2048
2016/02/17 14:57:47.368970 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
2016/02/17 14:57:47.369039 beat.go:147: INFO Init Beat: filebeat; Version: 1.1.1
2016/02/17 14:57:47.369603 config.go:154: INFO Additional config files are fetched from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv
2016/02/17 14:57:47.369722 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_BE1.yml
2016/02/17 14:57:47.370016 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_BE2.yml
2016/02/17 14:57:47.370243 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_ESB1.yml
2016/02/17 14:57:47.370590 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_ESB2.yml
2016/02/17 14:57:47.370850 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_FE1.yml
2016/02/17 14:57:47.371024 config.go:132: INFO Additional configs loaded from: /home/elk/git-repo/bper-monitoring-elk/beat/filebeat/npv/filebeat_FE2.yml
It's funny because it didn't give the error.
What's unexpected is that this:
./filebeat -configtest
gives:
2016/02/17 16:01:09 No paths given. What files do you want me to watch?
This seems to be a problem with parsing the multiline pattern when it is in quotes. We had the same issue and solved it by not quoting the multiline pattern in the config.
So your solution could also be to change
pattern: "^\d\d-[A-Za-z][A-Za-z][A-Za-z]-\d\d\d\d \d\d:\d\d:\d\d,\d\d\d.*"
to
pattern: ^\d\d-[A-Za-z][A-Za-z][A-Za-z]-\d\d\d\d \d\d:\d\d:\d\d,\d\d\d.*
This seems to happen only in combination with globbing for paths.
Which filebeat version are you using? The \d used in your pattern is a so-called Perl character class, which is not supported by filebeat (support has been merged into the master branch). Replace \d with [[:digit:]] or [0-9].
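For example, the pattern from above could be rewritten without \d like this (an untested sketch, assuming the regex engine supports {n} repetition, which Go's RE2-based regexp does):

```yaml
multiline:
  # matches the same DD-Mon-YYYY HH:MM:SS,mmm timestamp without \d
  pattern: ^[0-9]{2}-[A-Za-z]{3}-[0-9]{4} [0-9]{2}:[0-9]{2}:[0-9]{2},[0-9]{3}.*
```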
When the single quotes ('') are removed, the \ is interpreted by the YAML parser itself, turning your pattern into: ^dd-[A-Za-z][A-Za-z][A-Za-z]-dddd dd:dd:dd,ddd.*
Sorry, I missed that. We are using this pattern right now in our configuration:
pattern: ^\[[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}[[:space:]]+[[:digit:]]{2}:[[:digit:]]{2}:[[:digit:]]{2},[[:digit:]]{3}\]
and this works as expected per the filebeat documentation; maybe it would be valid with single quotes as well, but I don't want to test that right now.
It also seems to work with double quotes and correct escaping:
pattern: "^\\[[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}[[:space:]]+[[:digit:]]{2}:[[:digit:]]{2}:[[:digit:]]{2},[[:digit:]]{3}\\]"
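To sum up the quoting variants discussed in this thread (a sketch; each pattern line is an alternative, not to be combined in one config):

```yaml
# Unquoted plain scalar: backslashes pass through to the regex engine
pattern: ^\[[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}
# Single-quoted: backslashes are also taken literally
pattern: '^\[[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}'
# Double-quoted: YAML processes escape sequences, so double the backslashes
pattern: "^\\[[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}"
```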