Problem with file input (Logstash version 1.5.3)

Hello, I'm trying to take the logs from a file and send them to an Elasticsearch instance on another machine, but I don't know why: every time I try it, the Logstash service keeps running without ever finishing, and no new index is created on the ES machine.

This is my Logstash configuration:

input {
  file {
    path => ["/etc/elk/logout/test"]
    type => "file"
    start_position => "beginning"
  }
}
filter {
  if [type] == "file" {
    mutate {
      rename => ["@host", "host"]
    }
    dns {
      reverse => ["host"]
      action => "replace"
      nameserver => "IP_DNS_SERVER"
    }
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => [
        "message", "%{MESSAGE_1}",
        "message", "%{MESSAGE_2}",
        "message", "%{MESSAGE_3}",
        "message", "%{MESSAGE_4}",
        "message", "%{MESSAGE_5}"
      ]
    }
    date {
      match => [ "dater", "YYYY/MM/dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
  }
}

output {
  elasticsearch {
    protocol => "http"
    cluster => "logstash"
    host => "IP_B"
    index => "logstash-syslog-%{+YYYY.MM.dd}-fromFile"
  }
}

This is the format of the text file content (it has been taken from another ES instance):

{"message":"2016/10/05 15:09:30.146 ...","@version":"1","@timestamp":"2016-10-05T13:09:30.205Z","type":"udp"...."} 
{"message...} 
...

every time I try it, the Logstash service keeps running without ever finishing

Logstash doesn't shut down just because it reaches the end of the input file. It's designed to continuously monitor files and send data as it arrives.
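A related point worth checking (a sketch, assuming the input file has already been read at least once): the file input records how far it has read in a "sincedb" file, so start_position => "beginning" only applies the first time Logstash sees a file. When testing with the same file repeatedly, you can discard that position tracking by pointing sincedb_path at /dev/null:

input {
  file {
    path => ["/etc/elk/logout/test"]
    type => "file"
    start_position => "beginning"
    # Discard position tracking so the whole file is re-read on
    # every run. For testing only; assumes a Unix-like system.
    sincedb_path => "/dev/null"
  }
}

Without this, re-running Logstash against an already-read file produces no new events at all, which would also explain why no index appears.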

and no new index is created on the ES machine

This is an extremely common problem. Please see my response here: Logstash not indexing the input data to elastic search
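In short, a quick way to see whether events are making it through the pipeline at all (a debugging sketch, not specific to your setup) is to add a stdout output with the rubydebug codec alongside the elasticsearch output:

output {
  # Print every event to the console so you can tell whether the
  # file input and filters are actually producing events.
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    protocol => "http"
    cluster => "logstash"
    host => "IP_B"
    index => "logstash-syslog-%{+YYYY.MM.dd}-fromFile"
  }
}

If nothing is printed, the problem is on the input side (for example, the sincedb issue); if events are printed but no index appears, the problem is in the elasticsearch output or the connection to ES.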