File plugin path setup to multiple files


#1

I have a process that generates log files
aaalog_20160101
aaalog_20160102
aaalog_20160103

This config file did not work:
file {
  path => [ "D:\a\b\d\e\f\aaalog_*" ]
  start_position => "beginning"
  sincedb_path => "D:\a\b\d\e\f\.sincedb_aaa"
}

When changed to
file {
  path => [ "D:\a\b\d\e\f\aaalog_20160101" ]
  start_position => "beginning"
  sincedb_path => "D:\a\b\d\e\f\.sincedb_aaa"
}

it naturally picked up only that single file.

Any remedy?


(Magnus Bäck) #2

Your first attempt should work; the file input supports wildcards. I'd try with forward slashes instead of backslashes. Increasing the log verbosity with --verbose will make Logstash report which files each wildcard expands to (as well as other useful information).
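For what it's worth, the file input runs under JRuby, and Ruby's glob treats a backslash inside a pattern as an escape character rather than a path separator — which would explain why the wildcard only breaks with backslashes. A minimal Ruby sketch of that behavior (using a temporary directory rather than your actual D:\ layout):

```ruby
require "tmpdir"
require "fileutils"

Dir.mktmpdir do |dir|
  FileUtils.touch(File.join(dir, "aaalog_20160101"))

  # Forward slashes: the wildcard expands as expected.
  puts Dir.glob("#{dir}/aaalog_*").length   # => 1

  # Backslash: Dir.glob treats "\" as an escape character,
  # so the separator disappears and the pattern matches nothing.
  puts Dir.glob("#{dir}\\aaalog_*").length  # => 0
end
```

This is why forward slashes are the safe choice in `path` patterns even on Windows — the Windows file APIs accept them, and the glob semantics stay unambiguous.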


#3

Hi magnusbaeck,

I tried forward slashes and they worked as expected.

I am on Windows 10.
Backslashes did not work with a wildcard.
They did work for specific file paths and for the .sincedb path.
It would be convenient to copy and paste Windows paths (which use backslashes) directly into the config file.
Is there a way to make backslashes work here?
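For reference, this is the shape of the configuration that worked for me, with forward slashes throughout (same hypothetical D:\ layout as above):

```
file {
  path => [ "D:/a/b/d/e/f/aaalog_*" ]
  start_position => "beginning"
  sincedb_path => "D:/a/b/d/e/f/.sincedb_aaa"
}
```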

Regards,

{:timestamp=>"2016-01-26T11:18:24.403000-0500", :message=>"Registering file input", :path=>["D:\\a\\b\\c\\d\\e\\aaalog_*"], :level=>:info}
{:timestamp=>"2016-01-26T11:18:24.413000-0500", :message=>"Worker threads expected: 4, worker threads started: 4", :level=>:info}
{:timestamp=>"2016-01-26T11:18:24.420000-0500", :message=>"Using mapping template from", :path=>nil, :level=>:info}
{:timestamp=>"2016-01-26T11:18:24.563000-0500", :message=>"Attempting to install template", :manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"*", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"*", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"*", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"*", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"*", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"*", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"*", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"*", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, 
"latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}, :level=>:info}
{:timestamp=>"2016-01-26T11:18:24.653000-0500", :message=>"New Elasticsearch output", :class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"], :level=>:info}
{:timestamp=>"2016-01-26T11:18:24.654000-0500", :message=>"Pipeline started", :level=>:info}
{:timestamp=>"2016-01-26T11:18:25.658000-0500", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0xca2025 @operations_mutex=#<Mutex:0xdc54c>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x324b55>, @submit_proc=#<Proc:0x720ebc@C:/ELK/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @logger=#<Cabin::Channel:0x6d9f9b @metrics=#<Cabin::Metrics:0xc500e0 @metrics_lock=#<Mutex:0x11f3038>, @metrics={}, @channel=#<Cabin::Channel:0x6d9f9b ...>>, @subscriber_lock=#<Mutex:0x116c789>, @level=:info, @subscribers={12592=>#<Cabin::Outputs::IO:0x1c1cafe @io=#<File:c:\\ELK\\debug\\LS\\LS_aaa_1.log>, @lock=#<Mutex:0x151ab8e>>}, @data={}>, @last_flush=2016-01-26 11:18:24 -0500, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x3400e2>, @buffer=[], @flush_thread=#<Thread:0x1518fa5 run>>", :interval=>1, :level=>:info}
{:timestamp=>"2016-01-26T11:18:26.660000-0500", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0xca2025 @operations_mutex=#<Mutex:0xdc54c>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x324b55>, @submit_proc=#<Proc:0x720ebc@C:/ELK/logstash-2.1.1/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @logger=#<Cabin::Channel:0x6d9f9b @metrics=#<Cabin::Metrics:0xc500e0 @metrics_lock=#<Mutex:0x11f3038>, @metrics={}, @channel=#<Cabin::Channel:0x6d9f9b ...>>, @subscriber_lock=#<Mutex:0x116c789>, @level=:info, @subscribers={12592=>#<Cabin::Outputs::IO:0x1c1cafe @io=#<File:c:\\ELK\\debug\\LS\\LS_aaa_1.log>, @lock=#<Mutex:0x151ab8e>>}, @data={}>, @last_flush=2016-01-26 11:18:25 -0500, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x
