Logstash can't add fields?

I have been using Logstash to read some DB restore logs. Here are a few sample lines from the log:

07/08/2016  6:33:22.50: START restore database						
SQL2540W  Restore is successful, however a warning "2539" was encountered 
during Database Restore while processing in No Interrupt mode.
07/08/2016  6:33:28.93: END restore database						
SQL4406W  The DB2 Administration Server was started successfully.
07/08/2016  6:35:35.29: END restart server							
connect reset
DB20000I  The SQL command completed successfully.
07/08/2016  6:35:38.48: END p:\s6\source\system\CMD\res_uw.cmd		

Here is the filter part of my conf file.

if ([message] =~ /Backup successful/){
	grok{
		match => {"message" => ['%{GREEDYDATA:Message}'] }
	}
	mutate {
		add_tag => "send_to_es"
		add_field => {"Timestamp" => "%{GREEDYDATA:DATETIME}"}
	}
}
if ([message] =~ /warning "2539"/){
	grok{
		match => {"message" => ['%{GREEDYDATA:Message}'] }
	}
	mutate {
		add_tag => "send_to_es"
		add_field => {"Timestamp" => "%{GREEDYDATA:DATETIME}"}
	}
}
if ([message] =~ /(END p:|END P:)/){
	grok{
		match => {"message" => ['%{GREEDYDATA:DATETIME}:%{SPACE}END%{SPACE}%{GREEDYDATA:Mis}'] }
		remove_field => "%{GREEDYDATA:Mis}"
	}
	mutate {
		add_tag => "send_to_es"
	}
}	

I want to take the DATETIME value extracted from the last line of the record and add it as a field to the other messages, so that they are indexed with the same timestamp. However, the field is not added successfully. The output becomes:

      "message": "SQL2540W  Restore is successful, however a warning \"2539\" was encountered \r\r",
      "@version": "1",
      "@timestamp": "2016-07-12T02:28:52.337Z",
      "path": "C:/CIGNA/hkiapp67_db_restore/res_uw.log",
      "host": "SIMSPad",
      "type": "txt",
      "Message": "SQL2540W  Restore is successful, however a warning \"2539\" was encountered \r\r",
      "Timestamp": "%{GREEDYDATA:DATETIME}",
      "tags": [
        "send_to_es"
      ]

How could I solve this?

  add_field => {"Timestamp" => "%{GREEDYDATA:DATETIME}"}

Grok pattern syntax doesn't work here. Inside `add_field` you can only reference existing fields with Logstash's sprintf syntax, e.g. `%{DATETIME}`, and at that point the event doesn't have a DATETIME field anyway.

Furthermore, Logstash itself doesn't remember the contents of previous events and make them available when processing new events. You need a plugin like the aggregate filter to do that.
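For reference, here is a sketch of what the `mutate` block would look like with a plain sprintf field reference instead of a grok pattern (this still only works if the event already has a DATETIME field, which is not the case here):

```conf
mutate {
	add_tag   => "send_to_es"
	# %{DATETIME} is a sprintf reference to an existing field on this event,
	# not a grok pattern; if the field is missing, the literal text is kept.
	add_field => { "Timestamp" => "%{DATETIME}" }
}
```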

Sorry, I still don't get it. Since the messages don't share the same task_id, how can they be mapped together?
I have changed my conf to the following.

if ([message] =~ /(END p:|END P:)/){
	grok{
		match => {"message" => ['%{GREEDYDATA:DATETIME}:%{SPACE}END%{SPACE}%{GREEDYDATA:Mis}'] }
		remove_field => "Mis"    		
	}
	mutate {
		add_tag => "send_to_es"
	}
	aggregate{
		task_id => "%{DATETIME}"
		code => "map['Timestamp'] = event['DATETIME']"
		map_action => "create"
	}
}	

But I do not understand how I can put the data into the other events, since those events do not contain a DATETIME field.

One aggregate filter with map_action => "create" is not enough. You need at least one more aggregate filter to finalize the aggregated event. Look closely at the examples in the aggregate filter documentation; they should be very close to what you want.
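One possible shape, sketched under a few assumptions: since all lines of one restore run come from the same file, `%{path}` is used here as a shared task_id (your logs may need a different key), the interesting messages are buffered into the map, and the END line finalizes the task and carries the collected data. Field names (`Timestamp`, `messages`) are illustrative, and the `event['field']` access style matches the Logstash 2.x version used in this thread:

```conf
filter {
	if [message] =~ /warning "2539"/ {
		aggregate {
			task_id    => "%{path}"    # assumed shared key: one task per log file
			code       => "map['messages'] ||= []; map['messages'] << event['message']"
			map_action => "create_or_update"
		}
	}
	if [message] =~ /(END p:|END P:)/ {
		grok {
			match => { "message" => "%{GREEDYDATA:DATETIME}:%{SPACE}END%{SPACE}%{GREEDYDATA:Mis}" }
		}
		aggregate {
			task_id     => "%{path}"
			# attach the buffered messages and the extracted timestamp to this event
			code        => "event['Timestamp'] = event['DATETIME']; event['messages'] = map['messages']"
			map_action  => "update"
			end_of_task => true        # finalizes (discards) the map for this task
		}
		mutate {
			add_tag => "send_to_es"
		}
	}
}
```

With this pattern only the final END event is sent to Elasticsearch, carrying both the timestamp and the buffered warning messages; if you need the earlier events emitted separately with the timestamp attached, a different aggregate pattern (e.g. push_map_as_event options) would be needed.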