Problem loading CSV file using logstash

I am new to Logstash/Elasticsearch/Kibana.

I have created the following Logstash config on Windows:

input {
  file {
    path => "C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\data\avrs_syslog"
    start_position => "beginning"
    sincedb_path => "NUL"]
  }
}
filter {
  csv {
    separator => ";"
    columns => [ "type", "routing", "sysid", "date", "time", "jobid", "uexit", "msgid", "msgtxt" ]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:5601"
    index => "zosindex"
    document_type => "mylogs"
  }
}

When I execute it via Logstash I get the errors below, which I cannot decipher.

Any help would be greatly appreciated.

Regards
Alfredo

C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\bin>logstash -f C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\data\logstash.config2
Sending Logstash's logs to C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logs which is now configured via log4j2.properties
[2017-12-12T18:03:46,383][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/modules/fb_apache/configuration"}
[2017-12-12T18:03:46,411][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/modules/netflow/configuration"}
[2017-12-12T18:03:46,496][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-12T18:03:46,939][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 5, column 23 (byte 150) after input {\n  file {\n    path => "C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\data\avrs_syslog"\n\tstart_position => "beginning"\n\tsincedb_path => "NUL"", :backtrace=>["C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/compiler.rb:42:in `compile_ast'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/compiler.rb:50:in `compile_imperative'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/compiler.rb:54:in `compile_graph'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:107:in `compile_lir'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

Like the output says: "Expected one of #, {, } at line 5, column 23 (byte 150)": there is a typo in your config, namely the stray ']' in 'sincedb_path => "NUL"]'.
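With that bracket removed, the input section would look something like this (sketched against your paths; adjust as needed). Note that the file input also accepts forward slashes on Windows, which avoids backslash-escaping surprises:

```
input {
  file {
    path => "C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/data/avrs_syslog"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```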

Thanks, Kurt... I am so tired trying to get this working that I missed it. Now I am getting another error that I cannot troubleshoot. I think I will try Graylog; something that should be simple is getting so complicated.

Here is the log, and I really appreciate your help.
Alfredo

C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\bin>logstash -f C:\Users\aperez.SEAPUB\Downloads\logstash-6.0.1\data\logstash.config2
Sending Logstash's logs to C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logs which is now configured via log4j2.properties
[2017-12-12T21:09:19,399][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/modules/fb_apache/configuration"}
[2017-12-12T21:09:19,403][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/modules/netflow/configuration"}
[2017-12-12T21:09:19,492][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-12T21:09:20,470][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-12T21:09:23,965][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:5601/]}}
[2017-12-12T21:09:23,972][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:5601/, :path=>"/"}
[2017-12-12T21:09:24,265][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:5601/"}
[2017-12-12T21:09:24,334][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x6a21796a @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x651eb872 @metric=#<LogStash::Instrument::Metric:0x3affb093 @collector=#<LogStash::Instrument::Collector:0x63db6a15 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2d0a52d @store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x121d402d>, @fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=58 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x6d9242b6 @metric=#<LogStash::Instrument::Metric:0x3affb093 @collector=#<LogStash::Instrument::Collector:0x63db6a15 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2d0a52d @store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x121d402d>, @fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=58 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @logger=#<LogStash::Logging::Logger:0x120842ac @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x4ffa5691>>, @out_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, outputs, cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9, events] key: out value:0, @strategy=#<LogStash::OutputDelegatorStrategies::Shared:0x39b85485 @output=<LogStash::Outputs::ElasticSearch hosts=>[http://localhost:5601], index=>"zosindex", id=>"cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_acf493b1-2c15-431e-9b94-4e74162ef2e1", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>>, @in_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, outputs, cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9, events] key: in value:0, @id="cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9", @time_metric=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, outputs, cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9, events] key: duration_in_millis value:0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x69756c3c @metric=#<LogStash::Instrument::Metric:0x3affb093 @collector=#<LogStash::Instrument::Collector:0x63db6a15 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x2d0a52d @store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x121d402d>, @fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=58 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :cad5c2abcb6d3954ef380b8fd9109b31f84a03d4035f6a9129b3706c81e954f9, :events]>, @output_class=LogStash::Outputs::ElasticSearch>", :error=>"Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')\n at [Source: (byte[])\"\"; line: 1, column: 2]", :thread=>"#<Thread:0x55544699@C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-12T21:09:24,354][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: (byte[])""; line: 1, column: 2]>, :backtrace=>["C:/Users/aperez.SEAPUB/Downloads/logstash-6.0.1/logstash-core/lib/logstash/json.rb:41:in `jruby_load'", "C:/Use

I guess that's the next error in your config. Port 5601 is the default port for Kibana; you want to send your events to Elasticsearch, which by default listens on port 9200.
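For reference, an output block pointing at a default local Elasticsearch would look something like this (9200 is the stock HTTP port; change the host and port if your cluster is configured differently):

```
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "zosindex"
    document_type => "mylogs"
  }
}
```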


Kurt... you are so great!!! Many thanks for your help. I was able to load the data. I have other errors, but I will try to learn and find the solution myself. I will post another message if I need help.

Thanks again for your kindness and responses
Alfredo

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.