PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 4, column 36 (byte 99)

Hi,

Can anyone please help with this issue? I am running this from CMD: C:\Users\zondol\logstash-6.5.1\bin>logstash -f /Users/zondol/data/logstash.config

And below is my config file.

input {
  file {
    path => "C:\Users\zondol\datalog_file.csv"
    start_position => ["beginning,"end"]
    sincedb_path => "null"
  }
}

filter {
  csv {
    seperator => ","
    columns => ["originId","userId ","queryType","neStream ","neManager ","neId","subscriberId ","subscriberIdType"]
  }

  mutate { convert => ["userId","Integer"] }
}

output {
  elasticsearch {
    hosts => ["localhost:5601"]
    index => "logs"
    document_type => "output_imput_log"
  }

  stdout {}
}

start_position should be just "beginning" (not an array), and as far as I know sincedb_path on Windows should be "nul".
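
For reference, a minimal sketch of the corrected input block, keeping the path from the original post (forward slashes are generally the safer choice for the file input's path on Windows):

input {
  file {
    path => "C:/Users/zondol/datalog_file.csv"
    # a single string, not an array
    start_position => "beginning"
    # "nul" is the Windows null device; "null" would just create a file literally named null
    sincedb_path => "nul"
  }
}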

Thank you so much Christian 🙂, that error is gone, but I am now getting this one below:

Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x4b9fce4e>", :error=>"Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')\n a

Are you familiar with it?

Thanks in advance

That error I am not sure about, but the hosts setting is also wrong: you should send the data to Elasticsearch (usually port 9200), not to Kibana.
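
Concretely, a sketch of the corrected output block, keeping the index name from the config above (document_type is omitted here since it is deprecated in 6.x, as the later log output also warns):

output {
  elasticsearch {
    # 9200 is Elasticsearch's default REST port; 5601 is Kibana's UI port
    hosts => ["localhost:9200"]
    index => "logs"
  }
  stdout {}
}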

Yeah, I have changed the port number and the error message changed to this:

logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0xf9b1246 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="42eb6a52cafe982c94f38926283166ed9cf0a6e3bc3729f313754a8b32718206", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x275a5554>, @filter=#<LogStash::Filters::Mutate convert=>{"userId"=>"Integer"}, id=>"42eb6a52cafe982c94f38926283166ed9cf0a6e3bc3729f313754a8b32718206", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x3778a666 run>"}

Your help is highly appreciated, I am very new to this.

Have a look at your mutate filter and check the convert directive against the example in the docs.
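
For example (a sketch; the mutate filter's convert type names are lowercase, e.g. "integer", "float", "string", "boolean"):

filter {
  mutate {
    # "Integer" with a capital I is not a valid conversion type
    convert => { "userId" => "integer" }
  }
}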

Thanks, this is what I'm getting right now. I am not seeing anything in the console though; doesn't this mean my data should be visible in the Kibana front-end when this is done?

C:\Users\zondol\logstash-6.5.1\bin>logstash -f /Users/zondol/data/logstash.config
Sending Logstash logs to C:/Users/zondol/logstash-6.5.1/logs which is now configured via log4j2.properties
[2018-12-04T11:52:48,707][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-04T11:52:48,739][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.1"}
[2018-12-04T11:52:53,562][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=>#<LogStash::Outputs::ElasticSearch index=>"logs", id=>"9eb6298a3162a026999a6b9bf42eb7e070d89d073b5335cb5f6676c180f74d95", hosts=>[//localhost:9200], document_type=>"output_input_log", enable_metric=>true, codec=>#<LogStash::Codecs::Plain id=>"plain_5acfef7b-e479-4abc-9990-97d6db57c6c8", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-12-04T11:52:55,874][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-04T11:52:56,720][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-04T11:52:56,743][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-12-04T11:52:57,068][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-04T11:52:57,172][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-04T11:52:57,182][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-12-04T11:52:57,230][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-12-04T11:52:57,266][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-04T11:52:57,303][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-04T11:52:58,124][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x55843a7c run>"}
[2018-12-04T11:52:58,216][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-04T11:52:58,247][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-04T11:52:58,782][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
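
The pipeline itself started cleanly, so the next step is confirming that events actually reached Elasticsearch. A quick check, assuming the defaults above and the "logs" index from the config:

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logs/_search?pretty"

If the index exists but is empty, a common culprit is the sincedb: once the file input has recorded the file as fully read, it will not re-read it on restart unless that state is cleared. Kibana will also only show the data after an index pattern matching "logs" has been created.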
