While trying to load a configuration file, this error appears in Logstash 5.0.0-alpha4 run from CMD on Windows:

C:\logstash-5.0.0-alpha4>bin\logstash agent -f logstash.conf
--- jar coordinate com.fasterxml.jackson.core:jackson-annotations already loaded with version 2.7.1 - omit version 2.7.0
--- jar coordinate com.fasterxml.jackson.core:jackson-databind already loaded with version 2.7.1 - omit version 2.7.1-1
ERROR: too many arguments

See: 'bin/logstash --help'

C:\logstash-5.0.0-alpha4>bin\logstash agent -f logstash.conf --debug
--- jar coordinate com.fasterxml.jackson.core:jackson-annotations already loaded with version 2.7.1 - omit version 2.7.0
--- jar coordinate com.fasterxml.jackson.core:jackson-databind already loaded with version 2.7.1 - omit version 2.7.1-1
ERROR: too many arguments

Can anyone help me resolve this issue on the Windows platform?

Try without the agent parameter, as I believe this is deprecated. The list of available command line flags can be found here.

Without the agent parameter, this error appears:

C:\logstash-5.0.0-alpha4>bin\logstash -f logstash.conf
--- jar coordinate com.fasterxml.jackson.core:jackson-annotations already loaded with version 2.7.1 - omit version 2.7.0
--- jar coordinate com.fasterxml.jackson.core:jackson-databind already loaded with version 2.7.1 - omit version 2.7.1-1
Unknown setting 'host' for elasticsearch {:level=>:error}
Unknown setting 'protocol' for elasticsearch {:level=>:error}
fetched an invalid config {:config=>"input {\n file{\n path => ["C:\logstash-5.0.0-alpha4\mock_data.json"]\n type => "json"\n start_position => "beginning"\n sincedb_path => "/dev/null"\n }\n}\nfilter {\n grok {\n match => [ 'message', '(?\"id\":.*\"longitude\":\"[^"]+\")' ]\n add_field => [ "json_body", "{%{body}}" ]\n }\n json {\n source => "json_body"\n remove_field => ["message","body","json_body" ]\n }\n mutate {\n add_field => ["[geoip][location]","%{[latitude]}"]\n add_field => ["[geoip][location]","%{[longitude]}"]\n }\n mutate {\n convert => [ "[geoip][location]", "float"]\n }\n}\n\noutput {\n stdout {\n codec => rubydebug\n } \n elasticsearch {\n host => "127.0.0.1"\n protocol => "http"\n # Needs logstash-sth format (logstash-2015.05.06) as an index\n # else geo_point type becomes primitive(String or number)\n index => "logstash-json"\n }\n} \n", :reason=>"Something is wrong with your configuration.", :level=>:error}
The signal HUP is in use by the JVM and will not work correctly on this platform

C:\logstash-2.3.4>bin\logstash -f logstash.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 4
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: The setting host in plugin elasticsearch is obsolete and is no longer available. Please use the 'hosts' setting instead. You can specify multiple entries separated by comma in 'host:port' format. If you have any questions about this, you are invited to visit https://discuss.elastic.co/c/logstash and ask.>, :backtrace=>["C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/config/mixin.rb:87:in config_init'", "org/jruby/RubyHash.java:1342:ineach'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/config/mixin.rb:71:in config_init'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/outputs/base.rb:63:ininitialize'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/output_delegator.rb:74:in register'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:instart_workers'", "org/jruby/RubyArray.java:1613:in each'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:instart_workers'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:136:in run'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb:473:instart_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
The signal HUP is in use by the JVM and will not work correctly on this platform

Pretty obvious what the problem is ^

After changing 'host' to 'hosts', the issue is still not resolved:

C:\logstash-2.3.4>bin\logstash -f logstash.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 4
Unknown setting 'protocol' for elasticsearch {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/config/mixin.rb:134:in config_init'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/outputs/base.rb:63:ininitialize'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/output_delegator.rb:74:in register'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:instart_workers'", "org/jruby/RubyArray.java:1613:in each'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:instart_workers'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:136:in run'", "C:/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb:473:instart_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
The signal HUP is in use by the JVM and will not work correctly on this platform

Providing your config would probably be useful.

This is the configuration file I want to load:

input {
  file {
    path => ["C:\logstash-2.3.4\mock_data.json"]
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => [ 'message', '(?"id":.*"longitude":"[^"]+")' ]
    add_field => [ "json_body", "{%{body}}" ]
  }
  json {
    source => "json_body"
    remove_field => ["message", "body", "json_body"]
  }
  mutate {
    add_field => ["[geoip][location]", "%{[latitude]}"]
    add_field => ["[geoip][location]", "%{[longitude]}"]
  }
  mutate {
    convert => [ "[geoip][location]", "float"]
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "127.0.0.1"
    protocol => "http"
    index => "logstash-json"
  }
}

PMing people to get them to look at your problem is a good way to get them to ignore you. You should be patient.

The elasticsearch output plugin has changed since 1.x and now only supports HTTP protocol. Please look at the documentation for the particular version you are using to see what parameters are supported and what they are expected to look like.

If you are connecting to a local node on port 9200, you should not need to specify hosts and protocol (deprecated) as the default values should be fine.
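
For reference, a minimal elasticsearch output for Logstash 2.x and later might look something like this (just a sketch; the index name is carried over from your config and is only illustrative):

output {
  elasticsearch {
    # 'hosts' (plural) replaced 'host', and 'protocol' is gone now that the
    # output is HTTP-only; port 9200 is the default if omitted.
    hosts => ["127.0.0.1:9200"]
    index => "logstash-json"
  }
}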

It is plain that you're using an old configuration guide to create your Logstash configuration. Please verify version compatibility when using outdated guides from the internet.

  1. As has already been pointed out, agent is not necessary in the command line.

  2. The path in sincedb_path => "/dev/null" does not exist on Windows, and is probably causing errors as a result.

  3. I do not see a body field being captured anywhere in this grok expression, so I'm not sure how it could be added to json_body in the add_field statement (see the sketch after this list):

grok {
  match => [ 'message', '(?\"id\":.*\"longitude\":\"[^"]+\")' ]
  add_field => [ "json_body", "{%{body}}" ]
}
  4. Because of the json_body issue outlined above, how can mutate add fields that do not exist? Or convert fields that haven't been created?
  5. As pointed out already, protocol => "http" is not needed.
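
For what it's worth, a grok match that actually captures a body field would use a named capture, something along these lines (a sketch only, assuming the intent was to grab everything from "id" through "longitude" into a field called body):

grok {
  # (?<body>...) is a named capture; it stores the matched text in the 'body' field
  match => [ 'message', '(?<body>\"id\":.*\"longitude\":\"[^\"]+\")' ]
  add_field => [ "json_body", "{%{body}}" ]
}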

If you'd like some help getting something working here, please also attach a record sample from the file identified as C:\logstash-2.3.4\mock_data.json. We'll be much better able to see what you're trying to accomplish and help you to make it work.

I am not using the agent command or protocol => "http".
Here is a sample from C:\logstash-2.3.4\mock_data.json; the file has 1000 records in it.

[{"id":1,"first_name":"REDACTED","last_name":"REDACTED","date":"5/31/2014","email":"REDACTED@REDACTED.com","country":"France","city":"La Rochelle","latitude":"46.1667","longitude":"-1.15"},
{"id":2,"first_name":"REDACTED","last_name":"REDACTED","date":"1/7/2015","email":"REDACTED@REDACTED.tv","country":"Uruguay","city":"La Paloma","latitude":"-34.66268","longitude":"-54.16452"},
{"id":3,"first_name":"REDACTED","last_name":"REDACTED","date":"8/7/2014","email":"REDACTED@REDACTED.com","country":"China","city":"Dongmazar","latitude":"43.83139","longitude":"81.85056"},

EDIT: I redacted the personal email addresses.

You're going to have to work some magic on this file, then, as the JSON is in an array. That makes it a bit trickier to ingest via Logstash, as Logstash generally expects a single event per line.

I'm not sure Logstash can ingest a 1,000-line array as a single JSON object without exhausting some buffer.
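
For comparison, the file input handles line-delimited JSON most naturally, that is, one complete object per line with no array brackets or trailing commas, along these lines (reformatted from your sample):

{"id":1,"first_name":"REDACTED","last_name":"REDACTED","date":"5/31/2014","email":"REDACTED@REDACTED.com","country":"France","city":"La Rochelle","latitude":"46.1667","longitude":"-1.15"}
{"id":2,"first_name":"REDACTED","last_name":"REDACTED","date":"1/7/2015","email":"REDACTED@REDACTED.tv","country":"Uruguay","city":"La Paloma","latitude":"-34.66268","longitude":"-54.16452"}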

The main issue is that the pipeline starts and then aborts after a few seconds:
io/console not supported; tty will not be manipulated
The signal HUP is in use by the JVM and will not work correctly on this platform

Those are false positives. They may look like the error, but they likely have nothing to do with the reason your configuration is not working, as they frequently appear on Windows anyway.

I've taken the small array of sample data and tested locally for myself:

input {
  file {
    path => "/Users/buh/tmp/logstash-2.3.4/wintest.json"
    codec => json
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

output {
  stdout { codec => rubydebug }
}

And the output:

 bin/logstash -f wintest.conf --verbose
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 8
Registering file input {:path=>["/Users/buh/tmp/logstash-2.3.4/wintest.json"], :level=>:info}
Starting pipeline {:id=>"main", :pipeline_workers=>8, :batch_size=>125, :batch_delay=>5, :max_inflight=>1000, :level=>:info}
Pipeline main started
{
            "id" => 1,
    "first_name" => "REDACTED",
     "last_name" => "REDACTED",
          "date" => "5/31/2014",
         "email" => "REDACTED@REDACTED.com",
       "country" => "France",
          "city" => "La Rochelle",
      "latitude" => "46.1667",
     "longitude" => "-1.15",
      "@version" => "1",
    "@timestamp" => "2016-08-08T19:09:15.511Z",
          "path" => "/Users/buh/tmp/logstash-2.3.4/wintest.json",
          "host" => "redacted"
}
{
            "id" => 2,
    "first_name" => "REDACTED",
     "last_name" => "REDACTED",
          "date" => "1/7/2015",
         "email" => "REDACTED@REDACTED.tv",
       "country" => "Uruguay",
          "city" => "La Paloma",
      "latitude" => "-34.66268",
     "longitude" => "-54.16452",
      "@version" => "1",
    "@timestamp" => "2016-08-08T19:09:15.512Z",
          "path" => "/Users/buh/tmp/logstash-2.3.4/wintest.json",
          "host" => "redacted"
}
{
            "id" => 3,
    "first_name" => "REDACTED",
     "last_name" => "REDACTED",
          "date" => "8/7/2014",
         "email" => "REDACTED@REDACTED.com",
       "country" => "China",
          "city" => "Dongmazar",
      "latitude" => "43.83139",
     "longitude" => "81.85056",
      "@version" => "1",
    "@timestamp" => "2016-08-08T19:09:15.512Z",
          "path" => "/Users/buh/tmp/logstash-2.3.4/wintest.json",
          "host" => "redacted"
}

And for /dev/null on Windows, try using the file path NUL.

Now I can test your example for myself on Windows:

input {
  file {
    path => "C:\logstash-2.3.4\test.json"
    codec => json
    sincedb_path => "NUL"
    start_position => "beginning"
  }
}

output {
  stdout { codec => rubydebug }
}

C:\logstash-2.3.4>bin\logstash -f logstash.conf --verbose
io/console not supported; tty will not be manipulated
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 4
Registering file input {:path=>["C:\logstash-2.3.4\test.json"], :level=>:info}
Starting pipeline {:id=>"main", :pipeline_workers=>4, :batch_size=>125, :batch_delay=>5, :max_inflight=>500, :level=>:info}
Pipeline main started
JSON parse failure. Falling back to plain-text {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input within/between ARRAY entries
at [Source: [B@61811ede; line: 2, column: 186]>, :data=>"[{"id":1,"first_name":"Frank","last_name":"Mills","date":"5/31/2014","email":"fmills0@feedburner.com","country":"France","city":"La Rochelle","latitude":"46.1667","longitude":"-1.15"},\r", :level=>:info}
JSON parse failure. Falling back to plain-text {:error=>#<LogStash::Json::ParserError: Unexpected character (',' (code 44)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: [B@52bc9b7a; line: 1, column: 188]>, :data=>"{"id":2,"first_name":"Barbara","last_name":"Torres","date":"1/7/2015","email":"btorres1@ustream.tv","country":"Uruguay","city":"La Paloma","latitude":"-34.66268","longitude":"-54.16452"},\r", :level=>:info}
{
"message" => "[{"id":1,"first_name":"Frank","last_name":"Mills","date":"5/31/2014","email":"fmills0@feedburner.com","country":"France","city":"La Rochelle","latitude":"46.1667","longitude":"-1.15"},\r",
"tags" => [
[0] "_jsonparsefailure"
],
"@version" => "1",
"@timestamp" => "2016-08-08T20:51:27.241Z",
"path" => "C:\logstash-2.3.4\test.json",
"host" => "SONY-VAIO"
}
{
"message" => "{"id":2,"first_name":"Barbara","last_name":"Torres","date":"1/7/2015","email":"btorres1@ustream.tv","country":"Uruguay","city":"La Paloma","latitude":"-34.66268","longitude":"-54.16452"},\r",
"tags" => [
[0] "_jsonparsefailure"
],
"@version" => "1",
"@timestamp" => "2016-08-08T20:51:27.271Z",
"path" => "C:\logstash-2.3.4\test.json",
"host" => "SONY-VAIO"
}

Each of the \r characters between JSON lines is causing the error, as this is not valid JSON. The example I used and cut/pasted does not have these.

This is what I'm talking about: the Unexpected end-of-input within/between ARRAY entries error is caused by the \r.
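
If editing the file isn't convenient, one possible workaround (a rough sketch, not something from this thread, and untested against your full file) is to drop codec => json from the file input, strip the stray characters with mutate gsub, and then parse each cleaned line with the json filter; the patterns below assume every record sits on its own line:

filter {
  mutate {
    # Strip the trailing carriage return, the array's opening '[', and any
    # trailing ']' or ',' so each line becomes a standalone JSON object.
    gsub => [
      "message", '\r$', "",
      "message", '^\[', "",
      "message", '[\],]+$', ""
    ]
  }
  json {
    source => "message"
  }
}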
