Despite changes to the Logstash config, the error keeps pointing to the same place

Today I started learning about ES and LS using the latest versions.
Like any other curious newbie, I experienced various issues and solved them one by one. I am hoping the problem mentioned below will be the last one before I create my first pipeline.

Below is the config file I am using. I am trying to load a CSV file that has a mix of integers, dates, and text (a lot of it, including characters like /, ", and ').

Help:- The error points at line 23, column 37. I am assuming it refers to the config file. Despite the many changes I have made to the config file, such as passing fewer column names or using autodetect_column_names, it still points to the same place.

Please review both the config and the log and suggest what I am missing, and if my interpretation of the logs is not correct, help me understand it better.

FYI, I started programming again after 14 years and I am enjoying it. :slight_smile:

// My logstash.config

input {
  file {
    path => "C:\Users\Documents\Chander\Elastic\Data\May_Dec 17.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter{

   csv {
 	separator => ","
#	skip_empty_columns => true
columns => [ "Month", "Quarter", "Year", "INCIDENT_ID", "REQ_ID", "COUNTRY", "SERVICE", "ASSIGNED_GROUP", "STATUS","DUPLICATE_CALL_FLAG", "ASSIGNEE_LOGIN_ID", "LAST_MODIFIED_BY" ] **(Due to limitation of words removed few column names)**
    }
	
  	mutate {convert => ["REQ_ID", "integer"]}
  	mutate {convert => ["ASSIGNEE_LOGIN_ID", "integer"]}
  	mutate {convert => ["SLA_RESUME_MIN", "integer"]}
	mutate {gsub => ["RESOLUTION", "['"\\]","0"]
	mutate {gsub => ["SUMMARY", "['"\\]","0"]
}

output{
  elasticsearch {
    hosts => "localhost"
    index => "reports"
    document_type => "Inc details"
  }
  stdout{}
}

// Error I have been experiencing.

  [2018-02-25T21:50:03,909][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.1"}
[2018-02-25T21:50:04,293][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-02-25T21:50:04,622][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 23, column 37 (byte 1933) after filter{\n\n   csv {\n \tseparator => \",\"\n#\tskip_empty_columns => true\n#\tcolumns => [ \"Month\", \"Quarter\", \"Year\", \"INCIDENT_ID\", \"REQ_ID\", \"COUNTRY\", \"SERVICE\", \"ASSIGNED_GROUP\", \"STATUS\", \"STATUS_REASON\", \"SERVICE_TYPE\", #\"PRIORITY\", \"URGENCY\", \"IMPACT\", \"ASSIGNED_SUPPORT_ORGANIZATION\", \"ASSIGNED_SUPPORT_COMPANY\", #\"PIN\", \"FIRST_NAME\", \"LAST_NAME\", \"INTERNET_E_MAIL\", \"VIP\", \"CONTACT_SENSITIVITY\", \"ASSIGNEE\", \"SUBMITTER\", \"OWNER\", \"OWNER_SUPPORT_COMPANY\", #\"OWNER_GROUP\", \"OWNER_SUPPORT_ORGANIZATION\", \"DIRECT_CONTACT_COMPANY\", \"RESOLUTION\", \"RESOLUTION_CATEGORY\", \"RESOLUTION_CATEGORY_TIER_2\"  , #\"RESOLUTION_CATEGORY_TIER_3\", \"CLOSURE_PRODUCT_CATEGORY_TIER1\", \"CLOSURE_PRODUCT_CATEGORY_TIER2\", \"CLOSURE_PRODUCT_CATEGORY_TIER3\", \"SLA_RESUME_MIN\", #\"SLA_GOAL\", \"INC_SLA\", \"SLA_OVERALLSTARTTIME\", \"SLA_OVERALLSTOPTIME\", \"DUPLICATE_CALL_FLAG\", \"ASSIGNEE_LOGIN_ID\", \"LAST_MODIFIED_BY\" ]\n    }\n\tmutate {convert => [\"Month\", \"integer\"]}\n\tmutate {convert => [\"Quarter\", \"integer\"]}\n  \tmutate {convert => [\"Year\", \"integer\"]}\n  \tmutate {convert => [\"INCIDENT_ID\", \"integer\"]}\n  \tmutate {convert => [\"REQ_ID\", \"integer\"]}\n  \tmutate {convert => [\"ASSIGNEE_LOGIN_ID\", \"integer\"]}\n  \tmutate {convert => [\"SLA_RESUME_MIN\", \"integer\"]}\n\tmutate {gsub => [\"RESOLUTION\", \"['\"", :backtrace=>["C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", 
"C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "C:/Users/1480587/Documents/Chander/Elastic/logstash-6.2.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

As you appear to use Windows, I believe this should be sincedb_path => "nul".
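For reference, a minimal sketch of the input block with that change (and with forward slashes in the path, which the file input generally handles more reliably on Windows); the path itself is the one from the original config:

input {
  file {
    # forward slashes tend to be safer than backslashes for this path on Windows
    path => "C:/Users/Documents/Chander/Elastic/Data/May_Dec 17.csv"
    start_position => "beginning"
    # "nul" is the Windows equivalent of /dev/null, so no sincedb state is kept
    sincedb_path => "nul"
  }
}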

You are missing closing braces here.

The error message points out which line the error occurs on, so read it carefully and check your config.
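To make that concrete, here is a sketch of the filter block with the missing closing braces added. The gsub patterns are also rewritten so that no double quote has to appear inside a double-quoted config string, which is what broke parsing at line 23, column 37; the backslash the original pattern also targeted is left out to keep the escaping simple, and the column list is abbreviated:

filter {
  csv {
    separator => ","
    columns => [ "Month", "Quarter", "Year", "INCIDENT_ID", "REQ_ID" ]   # abbreviated
  }

  mutate { convert => ["REQ_ID", "integer"] }
  mutate { convert => ["ASSIGNEE_LOGIN_ID", "integer"] }
  mutate { convert => ["SLA_RESUME_MIN", "integer"] }

  # Each mutate block now has its closing brace. Single quotes and double
  # quotes are replaced in separate gsub pairs: the double-quote pattern sits
  # in a single-quoted config string and the single-quote pattern in a
  # double-quoted one, so neither needs escaping.
  mutate {
    gsub => [
      "RESOLUTION", "'", "0",
      "RESOLUTION", '"', "0",
      "SUMMARY", "'", "0",
      "SUMMARY", '"', "0"
    ]
  }
}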

Thank you for your response. I have created the pipeline successfully, though I encountered a few errors (see below). Based on my analysis, the errors occurred because some records contain " or '.

Q:- Does that mean the mutate statement below didn't work properly?
Q:- I have processed 2000+ records and received errors for a few of them. How do I know which records were not captured correctly? Is there a boolean return code for success and failure? (A sketch for this follows the snippet below.)

Please point me to any material that can help me understand this in more detail.

mutate {gsub => ["RESOLUTION", "['"\\]","0"]}
	mutate {gsub => ["SUMMARY", "['"\\]","0"]}

// My error logs

[2018-02-26T09:01:23,583][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}
[2018-02-26T09:01:23,590][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"Preventive Actions:N/A\"\tAPPLICATION\tALERTS\t\t\t\t\t6/21/1905 16:48\t48\tMet\t5/18/2017 1:38\t5/19/2017 10:58\t\t1561341\tAR_ESCALATOR\r", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}
[2018-02-26T09:01:23,590][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"May\tQ2\t2017\tINC000004517330\tREQ000006303530\tMalaysia\tCEMS\tGBL-IS-COREBANKING-APP SUPPORT\tClosed\tAutomated Resolution Reported\tUser Service Restoration\tLow\t4-Low\t4-Minor/Localized\tPhone\t5/18/2017 1:48\t5/18/2017 1:49\t5/19/2017 10:09\t6/3/2017

Dear @magnusbaeck... I need your assistance to understand a few concepts of Logstash.

  • What does the error below mean? It shows both WARN and an error.
  • How can I do clean data ingestion in LS?
  • How can I update the index / document_type with a new doc? (A sketch follows this list.)
  • Can I add more than one doc_type in an index? If yes, can I create any relationship between doc_types?
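On updating the index with a new doc: one common approach is to give each event a stable document id in the elasticsearch output, so that re-ingesting the same record overwrites the existing doc instead of creating a duplicate. A sketch, assuming INCIDENT_ID uniquely identifies a record:

output {
  elasticsearch {
    hosts => "localhost"
    index => "reports"
    document_type => "Inc details"
    # sending the same INCIDENT_ID again will update the existing doc
    document_id => "%{INCIDENT_ID}"
  }
}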

The one below was resolved by removing the extra ".

mutate {gsub => ["RESOLUTION", "['"\\]","0"]}
	mutate {gsub => ["SUMMARY", "['"\\]","0"]}

I am still experiencing the [WARN] below, followed by an error. Does it indicate that the particular doc had an error while being added to the index, or that it wasn't added at all?
When I search in Discover for IN8C000004, it does appear.

[2018-02-26T13:29:16,167][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"May\tQ2\t2017\tIN8C000004\tREQ000006986\tIndia\te\t5/8/2017 5:48\t5/8/2017 5:48\t5/12/2017 8:58\t5/27/2017 3:06\t5/27/2017 2:00\t5/8/2017 5:48\t\tAPPLICATION\8\t\t1276950\tAR_ESCALATOR\r", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}
[2018-02-26T13:29:16,164][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"May\tQ2\t2017\tINC000004456792\tREQ000006185906\\ssue:ejected", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}
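On whether such a doc is added at all: when the csv filter hits a parse error it logs the WARN, tags the event (by default with _csvparsefailure, if the bundled filter version supports tag_on_failure), and lets the event continue through the pipeline, so the raw record is still indexed. That would explain why IN8C000004 shows up in Discover even though its columns were not parsed. If those events should not be indexed at all, a sketch that discards them instead:

filter {
  # (csv filter as before)
  if "_csvparsefailure" in [tags] {
    # discard events the csv filter could not parse so they never reach Elasticsearch
    drop { }
  }
}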
