Logstash stops processing at midnight UTC

Hi Team,

We have a Logstash configuration that processes our logs and pushes them to Elasticsearch/Kibana. At midnight UTC it stops processing data. Here is our configuration:

input {
  file {
    path => "/var/log/orglogs/*/M*.log"
    ignore_older => 0
    start_position => "end"
  }
}
filter {

  grok {
    match => { "path" => "%{GREEDYDATA}/%{GREEDYDATA:myDate}/%{GREEDYDATA:stsn}.log" }
  }

  grok {
    match => { "message" => "%{DATA:logDate} %{NUMBER:logLineId} %{TIME:myTime} streamer\(%{NUMBER:unUsed2}\) Warning: TCurlMultiDocument: Request of 'http://%{DATA:streamingDomain}/Content/HLS/Live/Channel\(%{DATA:channelName}\)/Stream\(%{DATA:streamProfileNo}\)/Segment\(%{DATA:streamSegmentId}\)/segment.ts' failed with HTTP code %{DATA:httpErrorCode}, cURL code %{DATA:unUsed3}, HTTP response code said error" }
    remove_tag => ["_grokparsefailure"]
    add_tag    => [ "st-error-log" ]
    add_field  => { "indexOrSegmentFile" => "segment.ts" }
    add_field  => { "type" => "st-error-log" }
    add_field  => { "documentId" => "%{stsn}-%{logLineId}-%{myDate}T%{myTime}" }
  }

  if "_grokparsefailure" in [tags] {
    grok {
      match => { "message" => "%{DATA:logDate} %{NUMBER:logLineId} %{TIME:myTime} streamer\(%{NUMBER:unUsed2}\) Warning: TCurlMultiDocument: Request of 'http://%{DATA:streamingDomain}/Content/HLS/Live/Channel\(%{DATA:channelName}\)/Stream\(%{DATA:streamProfileNo}\)/index.m3u8' failed with HTTP code %{DATA:httpErrorCode}, cURL code %{DATA:unUsed3}, HTTP response code said error" }
      remove_tag => ["_grokparsefailure"]
      add_tag    => [ "st-error-log" ]
      add_field  => { "indexOrSegmentFile" => "index.m3u8" }
      add_field  => { "type" => "st-error-log" }
      add_field  => { "documentId" => "%{stsn}-%{logLineId}-%{myDate}T%{myTime}" }
    }
  }
  if "_grokparsefailure" in [tags] {
    grok {
      match => { "message" => "%{DATA:logDate} %{NUMBER:logLineId} %{TIME:myTime} sysman\(%{NUMBER:unUsed2}\) Note: Platform started" }
      remove_tag => ["_grokparsefailure"]
      add_tag    => [ "st-restart-log" ]
      add_field  => { "type" => "st-restart-initiated" }
      add_field  => { "documentId" => "%{stsn}-%{logLineId}-%{myDate}T%{myTime}" }
    }
  }

  date {
    match  => [ "logDate", "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
output {
  if "_grokparsefailure" in [tags] {
    stdout { codec => "dots" }
  } else {
    tcp {
      host  => "our.es"
      port  => 7076
      codec => json_lines
    }
    stdout { codec => rubydebug }
  }
}
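For context on the date filter above: the Joda-style pattern "yyyy-MM-dd HH:mm:ss" expects a single string containing both the date and the time. A rough Python equivalent of that parse (the sample timestamp is a made-up example, not from our logs):

```python
from datetime import datetime

# Rough Python equivalent of the Joda pattern "yyyy-MM-dd HH:mm:ss"
# used by the date filter; the sample value is hypothetical.
s = "2016-07-11 00:00:01"
ts = datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
print(ts.date(), ts.hour)  # 2016-07-11 0
```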

Every day at midnight UTC it completely stops processing data, and we need to restart Logstash to pick up the latest data again. This started happening after we added the new logDate field and a date filter for it.
Our input folder structure is as follows:
/var/logs/orglogs/ contains a folder for each day, e.g. 2016-1-01 through 2016-7-11. The logDate field is in fact extracted from each log line.
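To make the path grok concrete, here is a rough Python equivalent of what it extracts (the sample path and file name below are made-up examples, not real files from our system):

```python
import re

# Hypothetical path following our layout: <base>/<day>/<name>.log
path = "/var/logs/orglogs/2016-07-11/M1234.log"

# Rough Python equivalent of the grok pattern
# %{GREEDYDATA}/%{GREEDYDATA:myDate}/%{GREEDYDATA:stsn}.log
# (GREEDYDATA is a greedy .* in grok as well)
m = re.match(r"(.*)/(.*)/(.*)\.log$", path)
myDate, stsn = m.group(2), m.group(3)
print(myDate, stsn)  # 2016-07-11 M1234
```

So myDate ends up holding the day folder name and stsn the log file name without the .log extension.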

Any help will be highly appreciated. Thanks in advance.

Thanks
Govind

One more finding to add: after midnight, Logstash reprocesses data it has already processed instead of moving on to the new files. Can someone help us with this?

Thanks
Govind