Unable to feed log file to Elasticsearch through Logstash

I am trying to feed a log file through Logstash into Elasticsearch. My Logstash config file is:

input {
  file {
    path => "/root/Desktop/Graylog/SB_log.txt"
    start_position => "beginning"
    type => "logs"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "logs" {
    mutate {
      add_field => {"message" => "%{Message}"}
    }
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1"]
  }
}

The output works neither with stdout {} nor with gelf {}, and no error is shown.

When I run Logstash I get the following, and nothing happens after that:

[root@localhost logstash]# bin/logstash -f logstash-config.conf
Settings: Default pipeline workers: 4
Logstash startup completed

It stays like this. I tried with stdout {} and get the same result.

Please help me find the reason and a solution.
Thanks.

As this is a Logstash question rather than an ES question, I suggest you edit your post and change the category to Logstash.

Perhaps SB_log.txt is older than 24 hours and you need to adjust the file input's ignore_older option?
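For example (a sketch only — in Logstash 2.x the file input's ignore_older defaults to 86400 seconds, i.e. 24 hours, so files last modified before that are silently skipped), you could raise that threshold in your input block; the value below is an arbitrary large number chosen for illustration:

input {
  file {
    path => "/root/Desktop/Graylog/SB_log.txt"
    start_position => "beginning"
    type => "logs"
    sincedb_path => "/dev/null"
    ignore_older => 31536000  # one year, in seconds; default is 86400 (24 hours)
  }
}

If the file then gets picked up, the missing events were due to the age check rather than the pipeline itself.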