Logstash startup completed, then the system hangs and logs are not passed into Elasticsearch. How can I overcome this?

My command prompt shows "Logstash startup completed". I changed the filename and tried pushing the log again, but that doesn't help. How can I overcome this problem?

Please provide more info: what versions are you on, what does your config look like, and can you show us what happens when you start up LS?

The versions are Logstash 2.1.1 and Elasticsearch 2.1.1. My config:

input {
  file {
    path => "F:\wo\examples\10.log"
    start_position => "beginning"
    sincedb_path => "C:\Users\Akash katakam\null"
  }
}

filter {
  grok {
    match => { "message" => "%{TOMCAT_DATESTAMP:timestamp} \| %{LOGLEVEL:level} \| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}" }
  }
  if "_grokparsefailure" in [tags] {
    drop { }
  }
  grok {
    match => [ "message", "%{TOMCATLOG}", "message", "%{CATALINALOG}" ]
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS Z", "MMM dd, yyyy HH:mm:ss a" ]
  }
}

output {
  elasticsearch {
    index => "splog1"
  }
  stdout {}
}

The output you see is expected: LS has started and is waiting for input. The issue is likely sincedb. Have you tried deleting the file it creates?
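While testing, a common workaround is to point sincedb at the null device so Logstash forgets its read position between runs and always re-reads the file from `start_position`. A sketch, assuming a Windows host (on Linux/macOS use `/dev/null` instead of `NUL`):

```
input {
  file {
    path => "F:/wo/examples/10.log"   # forward slashes are safest in the file input's path option
    start_position => "beginning"
    sincedb_path => "NUL"             # null device: no read offsets persisted between runs
  }
}
```

Note this is only suitable for testing; in production you want sincedb so restarts don't re-ingest the whole file.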

Try changing the input:

input {
  stdin {}
}

And then pipe the file into it: type F:\wo\examples\10.log | logstash -f ....

Using stdin is the best way to test things like this.
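To make that concrete, here is a minimal test pipeline along those lines (the config filename test.conf is hypothetical; the rubydebug codec prints each parsed event to the console so you can see whether grok is matching at all):

```
input {
  stdin {}
}
filter {
  grok {
    match => { "message" => "%{TOMCATLOG}" }
  }
}
output {
  stdout { codec => rubydebug }   # show every event, parsed fields included
}
```

Run it with type F:\wo\examples\10.log | logstash -f test.conf. If events print here but nothing arrives via the file input, the problem is sincedb bookkeeping, not the filter or the Elasticsearch output.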

I deleted the sincedb files and restarted Logstash, but it is showing the same error.

That is not necessarily an error.

In my case, when I use a CSV file to load data, the first time I execute the logstash command it uploads successfully, but on the second and later runs it doesn't.

What I have figured out is that on the second and later runs, as long as there is no change in the input file, no data is uploaded. So keep the console open and try appending data to your log file; the new data will suddenly be uploaded.
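That matches how the file input works: sincedb stores the byte offset already read, so only bytes appended after that offset produce new events. A quick way to trigger a fresh event is to append a line while Logstash is running. A sketch assuming a POSIX shell and a local stand-in path (on Windows cmd the equivalent is echo new entry >> F:\wo\examples\10.log):

```shell
# Hypothetical local stand-in for F:\wo\examples\10.log
LOG=./10.log

# Append one new line; only bytes past the stored sincedb offset
# are read, so this is what makes Logstash emit a new event.
echo "2016-01-05 10:00:00,123 +0000 | INFO | com.example.App - heartbeat" >> "$LOG"

# Confirm the line landed at the end of the file
tail -n 1 "$LOG"
```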