Logstash error: no implicit conversion of Fixnum into String

Hi all,
I am using Filebeat, Logstash, Elasticsearch, and Kibana. Filebeat ships logs into Logstash, but the following error appears in the Logstash log:

[2019-01-21T12:30:54,588][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x31d7e2ea>", :error=>"no implicit conversion of Fixnum into String", :thread=>"#<Thread:0x5ab08bfa run>"}
[2019-01-21T12:30:54,603][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<TypeError: no implicit conversion of Fixnum into String>, :backtrace=>["org/jruby/ext/cgi/escape/CGIEscape.java:387:in `escape'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:150:in `setup_basic_auth'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:58:in `build'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch.rb:234:in `build_client'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:25:in `register'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:102:in `register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:46:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:594:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x5ab08bfa run>"}
[2019-01-21T12:30:54,656][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}

My configuration file is as follows:

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.170.156:9200"]
    index => "mylogstash-%{+YYYY.MM.dd}"
    user => logstash_user
    password => 123456789
  }
}

I also defined logstash_user in the Kibana console as follows:

POST /_xpack/security/user/logstash_user
{
  "password" : "123456789",
  "roles" : [ "logstash_writer", "logstash_system" ],
  "full_name" : "Internal logstash User"
}

and

POST _xpack/security/role/logstash_writer
{
  "cluster": ["manage_index_templates", "monitor"],
  "indices": [
    {
      "names": ["mylogstash-*"],
      "privileges": ["read", "write", "create_index"]
    }
  ]
}
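
The stored definitions can be double-checked with the corresponding GET endpoints (a quick sanity check, not strictly required):

```
GET /_xpack/security/user/logstash_user
GET /_xpack/security/role/logstash_writer
```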

Logstash cannot ship logs to Elasticsearch, and its log shows the errors above.
Any advice would be much appreciated.

There should be quotes around these. Without the quotes, `123456789` is parsed as a number (a Fixnum), and the Elasticsearch output then fails when it tries to URL-escape the credentials as strings (the `setup_basic_auth` / `CGIEscape` frames in your backtrace):

    user => "logstash_user"
    password => "123456789"

Thanks, I modified the config as you said. Now there is no error in the Logstash log, but I still cannot find the index in the Kibana console.

Hi,
I changed my Logstash config as follows:

input {
  file {
    path => "/home/srahimi/files/15.txt"
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.170.156:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    user => "logstash_user"
    password => "123456789"
  }
}

and the Logstash log is as follows:

[2019-01-22T09:40:18,433][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.1"}
[2019-01-22T09:40:24,858][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-22T09:40:25,703][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_user:xxxxxx@192.168.170.156:9200/]}}
[2019-01-22T09:40:25,737][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://logstash_user:xxxxxx@192.168.170.156:9200/, :path=>"/"}
[2019-01-22T09:40:26,426][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_user:xxxxxx@192.168.170.156:9200/"}
[2019-01-22T09:40:26,549][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-01-22T09:40:26,555][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-01-22T09:40:26,617][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://192.168.170.156:9200"]}
[2019-01-22T09:40:26,684][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-01-22T09:40:26,735][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-01-22T09:40:27,057][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_be7a90688df39321b6df22223ef5a7f1", :path=>["/home/srahimi/files/15.txt"]}
[2019-01-22T09:40:27,126][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1314fea1 run>"}
[2019-01-22T09:40:27,272][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-22T09:40:27,301][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-22T09:40:27,900][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

It seems there is no problem, but I cannot find any "logstash" index in Kibana. Any advice would be much appreciated.
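
One possibility worth noting here: the file input tails files by default, so if `15.txt` already existed when Logstash started and nothing new is appended to it, no events are read and no index is created. A minimal sketch of an input that forces a full re-read, assuming it is acceptable to discard the recorded read position:

```
input {
  file {
    path => "/home/srahimi/files/15.txt"
    start_position => "beginning"   # read files from the top instead of tailing
    sincedb_path => "/dev/null"     # do not remember read positions across restarts
  }
}
```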

Did you update the logstash_writer role to have access to logstash-* rather than mylogstash-*?
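
If not, the role needs to be updated so its index pattern matches the new index name, e.g.:

```
POST _xpack/security/role/logstash_writer
{
  "cluster": ["manage_index_templates", "monitor"],
  "indices": [
    {
      "names": ["logstash-*"],
      "privileges": ["read", "write", "create_index"]
    }
  ]
}
```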

Yes, I did, but there is still no logstash index in Kibana.
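
It would also help to check whether Elasticsearch created the index at all, which separates an ingest problem from a Kibana index-pattern problem. For example, in the Kibana console:

```
GET _cat/indices/logstash-*?v
```

If the index is listed there but not visible in Kibana, the missing piece is usually the index pattern under Management, not the pipeline.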

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.