Logstash + CSV Data

Hello All,

Here is the Elasticsearch mapping:

"properties": {
  "Runtime": {
    "type": "date",
    "format": "epoch_millis||MM/dd/yyyy HH:mm:ss"
  },
  "Cstring1": {
    "type": "string",
    "index": "not_analyzed"
  },
  "Cstring2": {
    "type": "string",
    "index": "not_analyzed"
  },
  "Class1": {
    "type": "integer"
  },
  "Class2": {
    "type": "integer"
  },
  "CStatus": {
    "type": "string",
    "index": "not_analyzed"
  }
}

The CSV data can be quite complicated:

02/06/2016 01:14:46,String1Data,String2Data1,1608,663,fail

Expected Output

"Runtime" : "02/06/2016 01:14:46",
"Cstring1" : "String1Data",
"Cstring2" : "String2Data1",
"Class1" : "1608",
"Class2" : "663",
"CStatus" : "fail"

02/06/2016 01:14:46,String1Data,String2Data1,String2Data2,String2Data3,String2Data4,700,63,fail

Expected Output

"Runtime" : "02/06/2016 01:14:46",
"Cstring1" : "String1Data",
"Cstring2" : "String2Data1,String2Data2,String2Data3,String2Data4",
"Class1" : "700",
"Class2" : "63",
"CStatus" : "fail"

02/06/2016 01:14:20,ERROR: Invalid entry,610,658

Expected Output

"Runtime" : "02/06/2016 01:14:20",
"Cstring1" : "ERROR: Invalid entry",
"Cstring2" : "N/A",
"Class1" : "610",
"Class2" : "658",
"CStatus" : "error"

How can I achieve this output from such complicated data? If I use a grok filter, what would the logic be?
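One possible grok approach is a sketch like the following (hedged, not tested against the full data set; `DATESTAMP`, `DATA`, `GREEDYDATA`, `INT`, and `WORD` are standard grok patterns). It tries the full-length shape first and falls back to the short error-shaped line, then fills in the fields the second pattern leaves unset:

```
filter {
  grok {
    match => {
      "message" => [
        # full lines: Runtime, Cstring1, one or more Cstring2 parts, Class1, Class2, pass/fail
        "^%{DATESTAMP:Runtime},%{DATA:Cstring1},%{GREEDYDATA:Cstring2},%{INT:Class1},%{INT:Class2},%{WORD:CStatus}$",
        # short error lines: no Cstring2 and no status field
        "^%{DATESTAMP:Runtime},%{DATA:Cstring1},%{INT:Class1},%{INT:Class2}$"
      ]
    }
  }
  # the second pattern leaves Cstring2 and CStatus unset, so default them
  if "_grokparsefailure" not in [tags] and ![CStatus] {
    mutate {
      add_field => {
        "Cstring2" => "N/A"
        "CStatus"  => "error"
      }
    }
  }
}
```

Note that grok captures are strings; since Class1/Class2 are mapped as integer, you may also want a `mutate { convert => ... }` before indexing.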

Note:

According to my logic, if we first separate the data with

csv {
  separator => ","
}

  1. We always have Runtime, Class1, and Class2 in each log.
  2. We can check the status at run time (a log either has a pass or a fail status; if neither, we insert an error status).
  3. Cstring1 is at a fixed position right after Runtime in each block.

So, we can split each log using the , separator:
Event[0] will be Runtime,
Event[1] will be Cstring1,
Event[length-3] will be Class1,
Event[length-2] will be Class2,
Event[length-1] will be the status (we need to check whether it is pass or fail; if not, the status field is missing, so Class1 and Class2 shift to Event[length-2] and Event[length-1], and we insert the error status),
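A minimal sketch of that positional logic in plain Ruby (`parse_record` is a hypothetical helper name; the same body could go inside a Logstash ruby filter's `code => "..."` option, reading from `event.get("message")` and writing each field back with `event.set`):

```ruby
# Split one CSV log line into the six expected fields.
# Assumes Runtime and Cstring1 never contain commas, per the rules above.
def parse_record(line)
  fields   = line.strip.split(",")
  runtime  = fields[0]
  cstring1 = fields[1]
  status   = fields[-1].strip.downcase

  if %w[pass fail].include?(status)
    # normal shape: ..., Class1, Class2, pass/fail
    class1, class2 = fields[-3], fields[-2]
    middle = fields[2..-4]          # zero or more Cstring2 parts
  else
    # error shape: the status field is missing, so the
    # last two fields are Class1 and Class2
    status = "error"
    class1, class2 = fields[-2], fields[-1]
    middle = fields[2..-3]
  end

  cstring2 = (middle.nil? || middle.empty?) ? "N/A" : middle.join(",")

  {
    "Runtime"  => runtime,
    "Cstring1" => cstring1,
    "Cstring2" => cstring2,
    "Class1"   => class1,
    "Class2"   => class2,
    "CStatus"  => status
  }
end
```

Class1/Class2 are left as strings here; Elasticsearch will coerce them against the integer mapping, or a mutate convert can do it explicitly.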

Thanks in advance for your help,
SAM