The signal HUP is in use by the JVM and will not work correctly on this platform

I just installed Logstash 2.4.0 on Windows 7 to try it out, but I got this HUP error when trying to load data from a CSV file. The CSV file is the donation data downloaded from http://data.donorschoose.org/open-data/overview/
(I followed this post: http://www.rittmanmead.com/blog/2015/04/using-the-elk-stack-to-analyse-donors-choose-data/)

The config file

input {
    file {
        start_position => beginning
        path => ["c:\dev\ELK\demo\data\opendata_donations.csv"]
        type => "csv"
    }
}

filter {
    csv {
        separator => ","
        columns => ["_donationid","_projectid","_donor_acctid","_cartid","donor_city","donor_state","donor_zip","is_teacher_acct","donation_timestamp","donation_to_project","donation_optional_support","donation_total","dollar_amount","donation_included_optional_support","payment_method","payment_included_acct_credit","payment_included_campaign_gift_card","payment_included_web_purchased_gift_card","payment_was_promo_matched","via_giving_page","for_honoree","donation_message"]
    }
}

output {
    elasticsearch { hosts => "localhost:9200" index => "opendata" index_type => "donations" }
}

The command line and the output:

C:\dev\ELK\demo\logstash\bin>logstash.bat --verbose -f c:\\logstash.conf
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 8
Registering file input {:path=>["c:\\dev\\ELK\\demo\\data\\opendata_donations.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"C:\\Users\\rockdale/.sincedb_42bf1ac4b6f27b058d61df780b7a7e23", :path=>["c:\\dev\\ELK\\demo\\data\\opendata_donations.csv"], :level=>:info}
Pipeline aborted due to error {:exception=>"LogStash::ConfigurationError", :backtrace=>["C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
Closing inputs {:level=>:info}
Closed inputs {:level=>:info}
The signal HUP is in use by the JVM and will not work correctly on this platform

I read some posts with the same error, but none of the suggested changes solved the problem. It tells me it is a configuration error, but being new to ELK I could not figure out what's wrong with my config file. Can anyone help? What am I missing?

Thanks in advance

Are you sure there's no human-readable error message in the vicinity of :exception=>"LogStash::ConfigurationError"? Could you paste the whole log again and make sure it's formatted as preformatted text (use the </> toolbar button) so that nothing's lost?

Possibly unrelated, but you may want to use forward slashes in your file paths instead of backslashes.
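
For example, the file input's path option would then look like this:

path => ["c:/dev/ELK/demo/data/opendata_donations.csv"]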

Thank you!

Nope. There is no further information around the ConfigurationError.

I piped the output into a text file:

The command line and console output:
C:\dev\ELK\demo\logstash\bin>logstash.bat --verbose -f c:\logstash.conf > err.txt
The signal HUP is in use by the JVM and will not work correctly on this platform

The content of err.txt:

{:timestamp=>"2016-09-19T15:59:34.886000-0400", :message=>"starting agent", :level=>:info}
{:timestamp=>"2016-09-19T15:59:34.890000-0400", :message=>"starting pipeline", :id=>"main", :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.323000-0400", :message=>"Registering file input", :path=>["c:/dev/ELK/demo/data/opendata_donations.csv"], :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.325000-0400", :message=>"No sincedb_path set, generating one based on the file path", :sincedb_path=>"C:\\Users\\rockdale/.sincedb_6df5cc0cbc179816823eecb22a5b736c", :path=>["c:/dev/ELK/demo/data/opendata_donations.csv"], :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.331000-0400", :message=>"Pipeline aborted due to error", :exception=>"LogStash::ConfigurationError", :backtrace=>["C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
{:timestamp=>"2016-09-19T15:59:38.334000-0400", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-09-19T15:59:38.337000-0400", :message=>"Closing inputs", :level=>:info}
{:timestamp=>"2016-09-19T15:59:38.340000-0400", :message=>"Closed inputs", :level=>:info}

The logstash.conf (I changed the slashes, but it does not make any difference):

input {
    file {
        start_position => beginning
        path => ["c:/dev/ELK/demo/data/opendata_donations.csv"]
        type => "csv"
    }
}

filter {
    csv {
        separator => ","
        columns => ["_donationid","_projectid","_donor_acctid","_cartid","donor_city","donor_state","donor_zip","is_teacher_acct","donation_timestamp","donation_to_project","donation_optional_support","donation_total","dollar_amount","donation_included_optional_support","payment_method","payment_included_acct_credit","payment_included_campaign_gift_card","payment_included_web_purchased_gift_card","payment_was_promo_matched","via_giving_page","for_honoree","donation_message"]
    }
}

output {
    elasticsearch { hosts => ["localhost:9200"] index => "opendata" index_type => "donations" }
}

Is there an easy way to get some sample data into Logstash? I just want enough data to get me started on a prototype to show our users.
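
As a side note, one low-effort option for this (just an untested sketch, not something from this thread) is to run a handful of CSV rows through a stdin input with the same csv filter and a stdout output, so parsed events can be inspected without touching Elasticsearch:

input {
    stdin { }
}

filter {
    csv {
        separator => ","
        # column list trimmed here for brevity; reuse the full list from the filter above
        columns => ["_donationid","_projectid","_donor_acctid","_cartid","donor_city"]
    }
}

output {
    stdout { codec => rubydebug }
}

The file (or a truncated copy of it) could then be fed in from the Windows command line with something like "type opendata_donations.csv | logstash.bat -f sample.conf", where sample.conf is just a hypothetical name for this test config.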

Again, thanks

I repeat: could you paste the whole log again and make sure it's formatted as preformatted text?

Please also post the configuration file in the same way.

Sorry, I did not make it clear. That's the whole log from the second run; it is no different from the first run. It is whatever appears on the console. On the second run I just piped the output to a text file instead of letting it print to the console. Or are you talking about other logs that I am not aware of?

That's the whole config file too.

I just figured out what you mean by preformatted.

Here is the preformatted version:

config file:

input {
    file {
        start_position => beginning
        path => ["c:/dev/ELK/demo/data/opendata_donations.csv"]
        type => "csv"
    }
}

filter {
    csv {
        separator => ","
        columns => ["_donationid","_projectid","_donor_acctid","_cartid","donor_city","donor_state","donor_zip","is_teacher_acct","donation_timestamp","donation_to_project","donation_optional_support","donation_total","dollar_amount","donation_included_optional_support","payment_method","payment_included_acct_credit","payment_included_campaign_gift_card","payment_included_web_purchased_gift_card","payment_was_promo_matched","via_giving_page","for_honoree","donation_message"]
    }
}

output {
    elasticsearch { hosts => ["localhost:9200"] index => "opendata" index_type => "donations" }
}

output on console:

{:timestamp=>"2016-09-19T15:59:34.886000-0400", :message=>"starting agent", :level=>:info}
{:timestamp=>"2016-09-19T15:59:34.890000-0400", :message=>"starting pipeline", :id=>"main", :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.323000-0400", :message=>"Registering file input", :path=>["c:/dev/ELK/demo/data/opendata_donations.csv"], :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.325000-0400", :message=>"No sincedb_path set, generating one based on the file path", :sincedb_path=>"C:\\Users\\rockdale/.sincedb_6df5cc0cbc179816823eecb22a5b736c", :path=>["c:/dev/ELK/demo/data/opendata_donations.csv"], :level=>:info}
{:timestamp=>"2016-09-19T15:59:35.331000-0400", :message=>"Pipeline aborted due to error", :exception=>"LogStash::ConfigurationError", :backtrace=>["C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "C:/dev/ELK/demo/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
{:timestamp=>"2016-09-19T15:59:38.334000-0400", :message=>"stopping pipeline", :id=>"main"}
{:timestamp=>"2016-09-19T15:59:38.337000-0400", :message=>"Closing inputs", :level=>:info}
{:timestamp=>"2016-09-19T15:59:38.340000-0400", :message=>"Closed inputs", :level=>:info}

Thank you for your help

Before you start Logstash, you can test the configuration file with the -t parameter.

You can find more information at this page https://www.elastic.co/guide/en/logstash/current/command-line-flags.html
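
For example, with the config file from above, the check would look something like this (it only validates the syntax and then exits; it does not start the pipeline):

C:\dev\ELK\demo\logstash\bin>logstash.bat --configtest -f c:\logstash.conf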

Thank you for the tip and sorry I did not get back to this issue earlier.

I tried this command:

C:\dev\ELK\demo\logstash\bin>logstash.bat -t c:\logstash.conf --verbose
Configuration OK

I played around a little.

I changed the output from elasticsearch to stdout. It works fine; I can see my data on the console.
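
A minimal stdout output for that kind of test looks something like this (the rubydebug codec is just a convenient choice that prints readable events to the console):

output {
    stdout { codec => rubydebug }
}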

Once I removed the index_type portion from

output {
    elasticsearch { hosts => ["localhost:9200"] index => "opendata" index_type => "donations" }
}

it no longer shows the HUP error and the data seems to be loading....
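
For reference, if the mapping type still needs to be set explicitly, the 2.x elasticsearch output appears to use document_type rather than index_type, so an output block along these lines should work (untested sketch based on the config above):

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "opendata"
        document_type => "donations"
    }
}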