Configure log file into JSON format

An example of my current log files:

Timestamp: 7/15/2016
Message: HandlingInstanceID: 12345
Type: ErrorMessage
MachineName: mymachine

Is there any way to configure my configuration file so that, when using the JSON output, I would get one JSON object:
{
"Timestamp" : "7/15/2016",
"Message" : "HandlingInstanceID: 12345",
"Type" : "ErrorMessage",
"MachineName" : "mymachine"
}

@asoong-94 I am not sure why you are trying to convert log data into JSON format. In Logstash, you can use the grok filter to match patterns in your data, and the output will be in JSON format. Here you can see how to use grok.
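For example, a minimal grok sketch for the first two lines of the log shown above (the field names are taken from the sample; the pattern would need to be extended for the remaining lines):

```conf
filter {
  grok {
    # Extract "Timestamp: 7/15/2016" and "Message: ..." style lines
    match => {
      "message" => "Timestamp: %{DATE_US:Timestamp}"
    }
  }
}
```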

Is it not true that Elasticsearch prefers JSON?

It is also the case that my log files have about 15 lines per log, and coming up with patterns for all of them seems a little redundant to me. So I figured that if I could get the log encoded into JSON, I could just query it from Elasticsearch.

You need to use a multiline codec to merge the physical lines into a single logical event. Is the "Timestamp: " line the first line of each logical event you want to capture?
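A minimal sketch of such a multiline codec, assuming each logical event starts with a "Timestamp: " line (the file path is hypothetical):

```conf
input {
  file {
    path => "/var/log/myapp/*.log"   # hypothetical path, adjust to your setup
    codec => multiline {
      # Any line that does NOT start with "Timestamp: " belongs
      # to the previous event.
      pattern => "^Timestamp: "
      negate => true
      what => "previous"
    }
  }
}
```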

Hello, thanks for your input. I've played with the multiline filter and got the lines to merge into one JSON event. However, now all the fields of my original error log are part of ONE GIANT field that Logstash labels "message". Is there any way to merge the events into one logical event while maintaining the separate fields?
{
  "message" : "Timestamp: 7/15/2016 Message: HandlingInstanceID: 12345 Type: ErrorMessage MachineName: mymachine",
  "@version" : "1",
  "@timestamp" : "2016-08-01T15:59:10.202Z",
  "host" : "SOONGA-7W",
  "tags" : ["multiline"]
}

The kv filter should be useful for splitting up the message field.
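A minimal kv sketch for the merged message above, assuming the original lines are still newline-separated inside the field (note that the Message value itself contains a colon, so you should verify the split and possibly use `value_split_pattern` instead):

```conf
filter {
  kv {
    source => "message"     # split the merged multiline field
    field_split => "\n"     # one key/value pair per original line
    value_split => ":"      # "Key: value" pairs
    trim_value => " "       # drop the space after each colon
  }
}
```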