Logstash: Configuration for WebSphere

Below is my Logstash configuration file for reading data from WebSphere logs.

input {
    file {
        type => "bolbo"
        path => [ "D:/Users/abced/Documents/My Received Files/1003/1003/gca1.log" ]
        start_position => "beginning"
    }
}
filter {
}
output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "webspherelogs"
    }
    stdout { }
}

My log file data is as follows:

2016-03-11 06:36:48,845 [WebSphere_EJB_Timer_Service_WorkManager.Alarm Pool : 0] ERROR com.lord.mss.cddb.gca.service.export.ExportConsumerServiceImpl - 'Could not export changed entities, exportId: 424572'
com.lord.mss.cddb.gca.persistence.dao.DaoException: Io exception: Socket read timed out
	at com.lord.mss.cddb.gca.persistence.cdb.export.IdToExportFinder.getIdsToExport(IdToExportFinder.java:107)

Logstash reads the data and shows, for example, 4 rows, and each document consists of only one field, `message`.

My questions are:
1) How do I specify column names for my data here?
2) Since the data is multiline, how can Logstash be configured to treat each line starting with a date in the above format as the beginning of a new document?

  1. Use a grok filter.
  2. Use a multiline codec for your file input that treats lines that don't begin with a timestamp as belonging to the preceding line.
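
Combining both suggestions, a configuration along these lines should work. The grok pattern and the multiline regex below are a sketch based on the single sample line above, so treat them as a starting point and adjust them to your actual log format:

    input {
        file {
            type => "bolbo"
            path => [ "D:/Users/abced/Documents/My Received Files/1003/1003/gca1.log" ]
            start_position => "beginning"
            codec => multiline {
                # Any line that does NOT start with a timestamp like
                # "2016-03-11 06:36:48,845" is joined to the previous event.
                pattern => "^%{TIMESTAMP_ISO8601}"
                negate => true
                what => "previous"
            }
        }
    }
    filter {
        grok {
            # Names the fields ("columns") extracted from the first line
            # of each event; pattern sketched from the sample line above.
            match => {
                "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:loglevel} %{JAVACLASS:class} - %{GREEDYDATA:msg}"
            }
        }
        date {
            # Use the log's own timestamp as the event's @timestamp.
            match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
        }
    }

Note that `GREEDYDATA` stops at the first newline, so for a multiline event the stack-trace lines remain in `message` even though `msg` only captures the remainder of the first line.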

Parsing Java logs comes up here (and in other places, like Stack Overflow) rather frequently, so you should be able to find something that's at least very close to what you need.