Setting up and testing a Logstash and Elasticsearch installation

Hi,

I am trying to load a CSV file into Logstash and output it to Elasticsearch.

I have read some of the related topics, and my error is similar to theirs.

I don't think the index has been created.

What should I do?

I don't see any error messages on the screenshot.

Did you check whether ES actually has the index? Did you check ES's logs?
What does your Logstash config look like? Are you actually feeding Logstash any data? The index will only be created once data is loaded into it.

I am trying to load data from a CSV file.
The data looks like this:
11-Aug,123,eXXX,200,bXXX,404_TRN_IDENTIFIER_NOT_FOUND,XX-number (identifier) provided not found
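For reference, a minimal pipeline for a CSV like this might look as follows. This is only a sketch: the file path, the column names, and the index name are my guesses for illustration, not taken from this thread.

```conf
input {
  file {
    # forward slashes, even on Windows (backslashes are glob escapes)
    path => "C:/Users/rbradley/Documents/sample-errors-test.csv"
  }
}
filter {
  csv {
    separator => ","
    # hypothetical names matching the seven fields in the sample row
    columns => ["date", "id", "source", "status", "target", "error_code", "error_message"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # hypothetical index name; defaults to logstash-* if omitted
    index => "informatica-errors"
  }
}
```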

My colleague was able to load it; his stack runs on a Mac in Docker. I am on Windows, and we have restrictions on our C: drive. I have Elasticsearch installed as a service under C:\Program Files. Logstash and Kibana were in my Downloads folder; I am now going to move them to my Documents folder and retry.

Here is my log:
Yes, the index hasn't been created.

When I ran in debug mode, I got the following error:
[2018-04-30T17:11:07,265][DEBUG][logstash.inputs.file ] _globbed_files: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: glob is: []
[2018-04-30T17:11:07,267][DEBUG][logstash.inputs.file ] _globbed_files: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: glob is: ["C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log"] because glob did not work

Here is where it says it is skipping files:
[2018-04-30T17:10:41,534][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/CONTRIBUTORS", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/Gemfile", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/Gemfile.lock", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/LICENSE", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/NOTICE.TXT", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/bin", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/config", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/data", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/lib", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/logs", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/logstash-core", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/logstash-core-plugin-api", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/modules", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/sample-errors-test.csv", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/tools", "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/vendor"]}
[2018-04-30T17:10:41,561][DEBUG][logstash.api.service ] [api-service] start
[2018-04-30T17:10:41,567][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/logstash-simple.conf"}

Here it is trying to install a template:
[2018-04-30T17:10:50,436][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-30T17:10:50,519][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-30T17:10:50,654][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-30T17:10:50,682][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-04-30T17:10:51,103][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}

The glob error:
[2018-04-30T17:10:52,804][DEBUG][logstash.inputs.file ] _globbed_files: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: glob is: []
[2018-04-30T17:10:52,815][DEBUG][logstash.inputs.file ] _globbed_files: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: glob is: ["C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log"] because glob did not work
[2018-04-30T17:10:52,847][DEBUG][logstash.inputs.file ] _discover_file: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: new: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log (exclude is [])

[2018-04-30T17:10:52,804][DEBUG][logstash.inputs.file ] _globbed_files: C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log: glob is:

Either the filename pattern is wrong or the user that Logstash runs as doesn't have sufficient permissions. Do you really have two nested logstash-6.2.4 directories?

Yes, I do. And yes, part of me thinks it may be an access issue. What permissions do you need to be able to run Logstash?

I found your reply on someone else's question; is this still valid? "The user directory is mode drwxrw-rw- (766), but it's important that the directory is executable (drwxrwxrwx, 777). It does not, however, need to be writable, so what you're really looking for is drwxr-xr-x (755). Same thing with the Desktop directory: it doesn't have to be writable by anyone but you."

Should I get my helpdesk to check my access to the directory?

The file itself is fine, as my colleague is able to process it in Docker on his Mac.

Okay, I have corrected my glob error by changing the slash direction: path => "C:/ instead of path => "C:\
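For anyone hitting the same issue: the file input's path option is passed through a glob, where backslashes act as escape characters, so Windows paths should be written with forward slashes. A sketch of the difference:

```conf
input {
  file {
    # backslashes are treated as glob escapes, so this matches nothing:
    # path => "C:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\informatica.log"

    # forward slashes work, even on Windows:
    path => "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log"
  }
}
```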

Now in debug mode it seems to:
a) hit a timeout exception
[2018-04-30T21:10:25,967][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Timeout exception {:poller=>#<LogStash::Instrument::PeriodicPoller::PersistentQueue:0x397aab54 @queue_type="memory", @agent=#<LogStash::Agent:0x239a215 @dispatcher=#<LogStash::EventDispatcher:0xeecbb2b @emitter=#<LogStash::Agent:0x239a215 ...>, @listeners=<Java::JavaUtilConcurrent::CopyOnWriteArraySet:0 []>>, @id_path="c:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/data/uuid", @running=<#Concurrent::AtomicBoolean:0xfb0 value:true>, @periodic_pollers=#<LogStash::Instrument::PeriodicPollers:0x3aebc259 @metric=#<LogStash::Instrument::Metric:0x41d47bb3 @collector=#<LogStash::Instrument::Collector:0x1a277e6b @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x40548c43 @store=#<Concurrent::map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0xbc294cc, @fast_lookup=#<Concurrent::map:0x00000000000fb8 entries=64 default_proc=nil>>>>, @periodic_pollers=[#<LogStash::Instrument::PeriodicPoller::Os:0x7efceb9b @metric=#<LogStash::Instrument::Metric:0x41d47bb3 @collector=#<LogStash::Instrument::Collector:0x1a277e6b @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x40548c43 @store=#<Concurrent::map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0xbc294cc, @fast_lookup=#<Concurrent::map:0x00000000000fb8 entries=64 default_proc=nil>>>>, @task=#<Concurrent::TimerTask:0x16473fd5 @observers=#<Concurrent::Collection::CopyOnNotifyObserverSet:0x3bce6dfb @observers={#<LogStash::Instrument::PeriodicPoller::Os:0x7efceb9b ...>=>:update}>, @timeout_interval=120.0, @running=<#Concurrent::AtomicBoolean:0xfbc value:true>, @StoppedEvent=#<Concurr

and in normal mode, the output is:
c:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\bin>logstash -f c:\Users\rbradley\Documents\logstash-6.2.4\logstash-6.2.4\logstash-simple.conf
Sending Logstash's logs to c:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/logs which is now configured via log4j2.properties
[2018-04-30T20:52:04,185][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"c:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/modules/fb_apache/configuration"}
[2018-04-30T20:52:04,211][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"c:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/modules/netflow/configuration"}
[2018-04-30T20:52:04,625][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-04-30T20:52:05,446][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-04-30T20:52:06,855][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-30T20:52:13,544][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-04-30T20:52:14,220][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-04-30T20:52:14,244][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-04-30T20:52:14,504][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-04-30T20:52:14,616][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-30T20:52:14,626][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-04-30T20:52:14,661][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-30T20:52:14,695][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-30T20:52:14,782][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-04-30T20:52:16,173][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x53edefaf run>"}
[2018-04-30T20:52:16,370][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

b) go into a loop:
[2018-04-30T20:34:06,002][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4618bd13 run>"}
[2018-04-30T20:34:06,224][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2018-04-30T20:34:06,307][DEBUG][logstash.inputs.file ] _globbed_files: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: glob is: ["C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log"]
[2018-04-30T20:34:06,318][DEBUG][logstash.inputs.file ] _discover_file: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: new: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log (exclude is [])
[2018-04-30T20:34:06,459][DEBUG][logstash.inputs.file ] _open_file: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: opening
[2018-04-30T20:34:06,496][DEBUG][logstash.inputs.file ] C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: initial create, no sincedb, seeking to end 6060
[2018-04-30T20:34:06,549][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:07,565][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:08,574][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:09,585][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:10,594][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:10,737][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-30T20:34:10,743][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-30T20:34:11,039][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x4618bd13 sleep>"}
[2018-04-30T20:34:11,601][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:12,604][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:13,612][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:14,620][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:15,630][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:15,783][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-30T20:34:15,786][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-30T20:34:16,047][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x4618bd13 sleep>"}
[2018-04-30T20:34:16,637][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:17,642][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:18,650][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:19,660][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:20,665][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:20,677][DEBUG][logstash.inputs.file ] _globbed_files: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: glob is: ["C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log"]
[2018-04-30T20:34:20,801][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-30T20:34:20,803][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-30T20:34:21,049][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x4618bd13 sleep>"}
[2018-04-30T20:34:21,686][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:22,690][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:23,695][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:24,703][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
[2018-04-30T20:34:25,709][DEBUG][logstash.inputs.file ] each: file grew: C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log: old size 0, new size 6060
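The "initial create, no sincedb, seeking to end 6060" line earlier in this log is the likely reason nothing is being indexed: by default the file input tails files, starting from the end, so content that was already in the file before Logstash started is never read. A sketch of the usual fix, assuming a single static file being re-read during testing:

```conf
input {
  file {
    path => "C:/Users/rbradley/Documents/logstash-6.2.4/logstash-6.2.4/informatica.log"
    # read existing content instead of only lines appended after startup
    start_position => "beginning"
    # on Windows, NUL plays the role of /dev/null; without this, the sincedb
    # remembers the file was already read and later runs will index nothing
    sincedb_path => "NUL"
  }
}
```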

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.