Logstash - csv parsing failure

Hi Team,

I have the data below in a CSV file, which has only two columns. As you can see, the content of the msg column is actually a set of key-value pairs, and I want Logstash to parse those key-value pairs as well.

@timestamp,msg
"August 15th 2019, 06:41:24.789","module=SCM fa=TS at=SCM.TS.SEARCH si=4C3D8709E51DDC4EE879A9E30729B512.mo-5692ea7ca ci=SCMStella cn=SCMStella cs=qacandrot_SCMStella. pi=dbPool1 ui=cgrant1 locale=en_US ktf1=[C,E,X,H,M] bf1=false bf2=false ktf2=Keyword if1=0"
"August 15th 2019, 06:41:20.318","module=SCM fa=TS at=SCM.TS.MODIFY_SEARCH si=4C3D8709E51DDC4EE879A9E30729B512.mo-5692ea7ca ci=SCMStella cn=SCMStella cs=qacandrot_SCMStella. pi=dbPool1 ui=cgrant1 locale=en_US ktf1=[C,E,X,H,M]".

Below is the filter section of my pipeline configuration.

filter {
  csv {
    columns => [ "@timestamp", "msg" ]
    separator => ","
    skip_header => "true"
  }

  if [field] == "msg" {
    kv {
      field_split => " "
      value_split => "="
      trim_key => " "
      trim_value => " "
      include_brackets => false
    }
  }
}

But when I run logstash -f ../config/logstash.conf, I get the error below.

[2019-08-15T16:41:32,040][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>#<LogStash::Event:0x9edd36>}
[2019-08-15T16:41:32,169][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>#<LogStash::Event:0x6c264f05>}
[2019-08-15T16:41:32,199][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"\"August 15th 2019, 06:41:24.789\",\"module=SCM fa=TS at=SCM.TS.SEARCH si=4C3D8709E51DDC4EE879A9E30729B512.mo-5692ea7ca ci=SCMStella cn=SCMStella cs=qacandrot_SCMStella. pi=dbPool1 ui=cgrant1 locale=en_US ktf1=[C,E,X,H,M] bf1=false bf2=false ktf2=Keyword if1=0\"\r", :exception=>#<TypeError: wrong argument type String (expected LogStash::Timestamp)>}
[2019-08-15T16:41:32,202][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>#<LogStash::Event:0xe366c5a>}
[2019-08-15T16:41:32,206][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"\"August 15th 2019, 06:41:20.318\",\"module=SCM fa=TS at=SCM.TS.MODIFY_SEARCH si=4C3D8709E51DDC4EE879A9E30729B512.mo-5692ea7ca ci=SCMStella cn=SCMStella cs=qacandrot_SCMStella. pi=dbPool1 ui=cgrant1 locale=en_US ktf1=[C,E,X,H,M]\"\r", :exception=>#<TypeError: wrong argument type String (expected LogStash::Timestamp)>}
C:/elkstack/logstash-7.0.1/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated

What is the issue here?

This is your problem.
@timestamp is of type Timestamp, so the csv filter cannot write a plain string into it (hence the "wrong argument type String (expected LogStash::Timestamp)" error). Ingest the first column of your CSV into a string field first, then use the date filter to parse it into @timestamp.

Like this:
Note: This may not work, as I'm not sure whether the "th" in the date will translate properly. If it does not, you'll need a grok filter (or a mutate) first.

csv {
  columns => [ "string_timestamp", "msg" ]
  separator => ","
  skip_header => "true"
}
date {
  target => "@timestamp"
  match => [ "string_timestamp", "MMMMM ddth yyyy, HH:mm:ss.SSS" ]
  # August 15th 2019, 06:41:24.789
}

If the "th" does not parse, you can use mutate+gsub to remove the ordinal indicator (st/nd/rd/th) before the date filter runs, and then match with a plain "MMMM dd yyyy, HH:mm:ss.SSS" pattern:

mutate { gsub => [ "string_timestamp", "(\d)(st|nd|rd|th) ", "\1 " ] }