Logstash just sits and the CPU spikes

Good morning everyone.
Today I downloaded Logstash, Elasticsearch, and Kibana.

I have everything configured and it's all displaying the way it should, except when I run my config file in PowerShell. It just sits and hangs and doesn't do much. The CPU spikes to 100% and stays there, but no new index is created.

Here is my config file for those who want to see what I am working with:

input {
  file {
    path => "X:/XXXXXXX/*.*"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "XXXXX"
  }
  stdout { codec => rubydebug }
}

If anyone has any advice, that would be great. I have tried changing the path from forward slashes ("X:/XXXXXXX/*.*") to backslashes ("X:\XXXXXXX\*.*") and it just sits there too.

Here is the output I have from PowerShell.

Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.

Try the new cross-platform PowerShell https://aka.ms/pscore6

PS C:\Windows\system32> D:\ELKSTACK\logstash\bin\logstash.bat -f D:\ELKSTACK\logstash\config\email.conf
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/D:/ELKSTACK/logstash/logstash-core/lib/jars/jruby-complete- to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to D:/ELKSTACK/logstash/logs which is now configured via log4j2.properties
[2019-06-24T10:00:53,477][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-24T10:00:53,489][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-24T10:00:57,160][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-24T10:00:57,334][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-24T10:00:57,374][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-24T10:00:57,376][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-06-24T10:00:57,397][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-24T10:00:57,406][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-24T10:00:57,416][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x64d91542 run>"}
[2019-06-24T10:00:57,506][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-24T10:00:57,705][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-24T10:00:57,732][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-24T10:00:57,735][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-24T10:00:57,979][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

You definitely want forward slashes, not backslashes, in the path option of a file input. You could use the hot_threads API to try to get an idea of what Logstash is doing. Also, '--log.level trace' (not debug) will cause filewatch to log which files it is reading data from.
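Concretely, both suggestions can be tried from the same PowerShell session — a sketch, assuming Logstash is still running and its API is on the default port 9600 (which the "Successfully started Logstash API endpoint {:port=>9600}" log line above confirms):

```powershell
# Ask the running Logstash instance what its busiest threads are doing:
Invoke-RestMethod "http://localhost:9600/_node/hot_threads?human=true"

# Re-run the pipeline with trace-level logging so filewatch logs which
# files it is discovering and reading from:
D:\ELKSTACK\logstash\bin\logstash.bat -f D:\ELKSTACK\logstash\config\email.conf --log.level trace
```

If the hot_threads output shows the pipeline workers busy in the csv filter or the file input's glob discovery, that narrows down where the 100% CPU is going.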

It would probably be better if I ran this on my Linux machine.

NM, Linux is out. I will try again here shortly.
