Arbitrary non-standard log file: need help with a grok pattern

Hi,
I have an unstructured log file whose lines don't follow a single pattern. Some sample lines:

SOMEWORD: (@aaa.py:1222) Started test case 123 execution

  • ABC_TIME: started script aaa.py at [20151202 17:10:28]

SOMEWORD: (@abc.pd) Execution time:01:01:15

SOMEWORD: (@aaa.py:12) FAIL:0: Test case 123 executed successfully

I need to extract "test case 123", "[20151202 17:10:28]", "01:01:15", and "FAIL" from the lines above and store them in Elasticsearch under a single document ID.
I wrote the grok patterns below, but each field ends up stored as a separate document with a different ID in Elasticsearch.

filter {
  if [message] =~ /SOMEWORD: / and [message] =~ /aaa/ and [message] =~ /Started/ {
    grok {
      match => { "message" => "%{WORD}: (%{GREEDYDATA}) Started %{GREEDYDATA:Testname} %{WORD}" }
      add_field => ["var", "Testname"]
    }
  } else if [message] =~ /SOMEWORD: / and [message] =~ /successfully/ and [message] =~ /Test case / {
    grok {
      match => { "message" => "%{WORD}: (%{GREEDYDATA}) %{WORD:Status}:%{GREEDYDATA}" }
      add_field => ["var", "Status"]
    }
  } else if [message] =~ /ABC_TIME: / and [message] =~ /aaa/ {
    grok {
      match => { "message" => "* ABC_TIME: %{WORD} %{WORD} %{WORD}.%{WORD} at [%{YEAR}%{MONTHNUM}%{MONTHDAY} %{TIME:start_time}]" }
      add_field => ["var", "start_time"]
    }
  } else if [message] =~ /Execution time:/ {
    grok {
      match => { "message" => "REPORT:(%{GREEDYDATA}) Execution time:%{TIME:execution_time}" }
      add_field => ["var", "execution_time"]
    }
  } else {
    drop {}
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => "localhost:9200"
    index => "test"
    document_type => "Test_log"
    document_id => "%{var}"
  }
}

Could someone please help?

Have a look at the aggregate filter.

I agree with @magnusbaeck.
If you have a start event and an end event and you want to merge both into a single Elasticsearch document, the aggregate plugin is perfect for that.
There are examples in the aggregate plugin documentation to help you configure it.
And if you have questions about the plugin, don't hesitate to ask.
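To give you an idea, here is a rough sketch of what that could look like for your log lines. This is an untested assumption-laden example, not a drop-in config: it assumes the four lines of one test run arrive contiguously from a single source (so `[host]` works as the correlation key and you run Logstash with one worker, `-w 1`), and the grok patterns and field names (`test_name`, `start_time`, `execution_time`, `status`) are mine, not yours.

```
filter {
  # One grok pattern per line shape; with break_on_match (the default),
  # the first pattern that matches wins. tag_on_failure => [] keeps
  # non-matching lines from being tagged with _grokparsefailure.
  grok {
    match => { "message" => [
      "Started %{DATA:test_name} execution",
      "at \[%{YEAR}%{MONTHNUM}%{MONTHDAY} %{TIME:start_time}\]",
      "Execution time:%{TIME:execution_time}",
      "%{WORD:status}:%{INT}: Test case"
    ] }
    tag_on_failure => []
  }

  if [test_name] {
    # "Started" line: open the map for this run.
    aggregate {
      task_id => "%{host}"
      code => "map['test_name'] = event.get('test_name')"
      map_action => "create"
    }
    drop {}   # only the final merged event should reach the output
  } else if [start_time] or [execution_time] {
    # Intermediate lines: stash their fields in the map.
    aggregate {
      task_id => "%{host}"
      code => "
        map['start_time']     ||= event.get('start_time')
        map['execution_time'] ||= event.get('execution_time')
      "
      map_action => "update"
    }
    drop {}
  } else if [status] {
    # Final FAIL/PASS line: copy the accumulated fields onto this
    # event and close the task, so one document holds everything.
    aggregate {
      task_id => "%{host}"
      code => "
        event.set('test_name', map['test_name'])
        event.set('start_time', map['start_time'])
        event.set('execution_time', map['execution_time'])
      "
      map_action => "update"
      end_of_task => true
    }
  } else {
    drop {}
  }
}
```

With a single merged event per run you also no longer need `document_id => "%{var}"` in the elasticsearch output; letting Elasticsearch assign IDs avoids the overwriting you're seeing.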