Hi All,
I have been trying to load CSV data with Logstash, but I am unable to index the data and keep getting a character-set mismatch error.
I noticed these warnings occur only for rows where some field values are empty (for example, if a ticket is not yet resolved, its "Resolved_Date" column is empty).
Here are the errors:
[2018-11-13T14:13:25,886][WARN ][logstash.codecs.plain    ] Received an event that has a different character encoding than you configured. {:text=>"INC0230207,UA-SAP-UK,10-25-18 15:38,SAP workflow in transaction ZMMBV duplicates purchaseorder workflow item,TCS-SAP-DC,Self-service,FALSE,Ren\\x82 Wijnenga,Low,,,Assigned,a306815,,,\\r", :expected_charset=>"UTF-8"}
[2018-11-13T14:13:26,062][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"INC0230914,UA-SAP-UK,10-26-18 14:39,SAP password reset / account unlock SAP-ID: B015060,SD-INCIDENTS,Email,FALSE,Simon Edwards,Support,10-26-18 14:42,,Resolved,a323921,,Password reset / Unlocked Account,\"Password reset done.", :exception=>#<CSV::MalformedCSVError: Unclosed quoted field on line 1.>}
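For what it's worth, the `\x82` byte in the first warning suggests the encoding problem is not about empty fields: that byte is invalid as a UTF-8 start byte, but in DOS/Windows code pages such as CP850 or CP437 it maps to "é", so "Ren\x82 Wijnenga" is most likely "René Wijnenga" in a non-UTF-8 export. A minimal Python check (the byte string is taken from the warning above; CP850 is an assumption about the export encoding):

```python
# Byte sequence copied from the Logstash warning.
raw = b"Ren\x82 Wijnenga"

# 0x82 is a UTF-8 continuation byte with no lead byte, so UTF-8 decoding fails.
try:
    raw.decode("utf-8")
except UnicodeDecodeError as err:
    print("not UTF-8:", err.reason)

# In CP850 (a common DOS/Windows export code page) 0x82 is "é".
print(raw.decode("cp850"))  # -> René Wijnenga
```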
Here is my config file:
input {
  file {
    path => "/opt/installables/csv/Book1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
      separator => ","
      columns => ["Number","Configuration_item","Opened","Short_description","Assignment_group","Contact_type","Major_incident","Caller","Priority","Resolved","Closed","Status","Updated_by","Closed_by","Category","Close_notes"]
  }
}
output {
#   elasticsearch {
#     hosts => "1.12.1.3:9200"
#     index => "incanalysis"
#  }
  stdout {
    codec => rubydebug
  }
}
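If the file really is in a Windows code page rather than UTF-8, the plain codec's `charset` option can tell Logstash how to decode it before the csv filter runs. A sketch of the input block, assuming the file is CP1252 (check how the CSV was exported; Excel on Windows often produces CP1252 or CP850):

```
input {
  file {
    path => "/opt/installables/csv/Book1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Transcode from the file's real encoding to UTF-8 before filtering.
    # "CP1252" is a guess here; substitute the actual export encoding.
    codec => plain { charset => "CP1252" }
  }
}
```

The second error (`MalformedCSVError: Unclosed quoted field`) looks like a separate issue: the quoted "Close_notes" value appears to contain an embedded newline, and the file input emits one event per line, so the quote opens in one event and closes in another. Re-exporting the CSV without embedded newlines, or joining such rows with a multiline codec before the csv filter, are the usual workarounds.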
Any suggestions on where I'm going wrong?
Thanks
Gauti