Logs are getting parsed multiple times and "A plugin had an unrecoverable error" keeps appearing. Can someone help me with this issue?

I am a newbie to ELK.
Currently, I am working with logstash-2.0.0-beta2. Whenever I run my Logstash conf file, I get the following errors:
1) io/console not supported; tty will not be manipulated
2) Error: No such file or directory - R:/dev/null.13908.9128.258181 {:level=>:error}
3) A plugin had an unrecoverable error. Will restart this plugin.
my conf file:

input {
  file {
    path => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf"
    start_position => beginning
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}
output {
  elasticsearch { hosts => "localhost" }
  stdout { codec => rubydebug }
}

my logs:
55.3.244.1 GET /index.html 15824 0.043

I am unable to find a solution for these errors. Please help me.

1)io/console not supported; tty will not be manipulated

I believe you can ignore this.

2) Error: No such file or directory - R:/dev/null.13908.9128.258181 {:level=>:error}

"/dev/null" only works on non-Windows hosts. On Windows perhaps you can use "nul" instead.
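
To illustrate, the file input might then be written like this. This is only a sketch based on the config above; "nul" is an assumption about how the Windows null device will behave with the file input's sincedb handling:

```
input {
  file {
    path => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf"
    start_position => beginning
    # "/dev/null" does not exist on Windows; "nul" is the Windows null device
    sincedb_path => "nul"
  }
}
```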

Thank you, Magnusbaeck, for your quick response.

I have tried what you suggested, but I am getting the same error. I am using the following versions:

Software:

1)logstash-2.0.0-beta2
2)elasticsearch-2.0.0-beta2
3)kibana-4.2.0-beta1

my conf file:

input {
  file {
    path => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf"
    start_position => beginning
    sincedb_path => "nul"
  }
}
filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}
output {
  elasticsearch { hosts => "localhost" }
  stdout { codec => rubydebug }
}

My logs:

55.3.244.1 GET /index.html 15824 0.043

ERROR:

Worker threads expected: 1, worker threads started: 1 {:level=>:info}
{
"message" => "55.3.244.1 GET /index.html 15824 0.043\r",
"@version" => "1",
"@timestamp" => "2015-10-30T07:31:53.458Z",
"host" => "Q07500-2K1",
"path" => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf",
"client" => "55.3.244.1",
"method" => "GET",
"request" => "/index.html",
"bytes" => "15824",
"duration" => "0.043"
}
Pipeline started {:level=>:info}
Logstash startup completed
{
"message" => "55.3.244.1 GET /index.html 15824 0.043\r",
"@version" => "1",
"@timestamp" => "2015-10-30T07:31:54.557Z",
"host" => "Q07500-2K1",
"path" => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf",
"client" => "55.3.244.1",
"method" => "GET",
"request" => "/index.html",
"bytes" => "15824",
"duration" => "0.043"
}
503587847-325748-2621440 0 2 120
A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::File path=>["C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf"], start_position=>"beginning", sincedb_path=>"nul",
Error: No such file or directory - nul.13908.13824.841500 or nul {:level=>:error}
{
"message" => "55.3.244.1 GET /index.html 15824 0.043\r",
"@version" => "1",
"@timestamp" => "2015-10-30T07:32:01.348Z",
"host" => "Q07500-2K1",
"path" => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf",
"client" => "55.3.244.1",
"method" => "GET",
"request" => "/index.html",
"bytes" => "15824",
"duration" => "0.043"
}

Okay, well, don't set sincedb_path at all then, or set it to a known fixed path of your choosing. If your goal is indeed to make Logstash process the file from the top every time, you can delete the sincedb file before you run Logstash.
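
As a sketch, the input could look like this; the sincedb location here is a hypothetical path of your own choosing, not something the file input requires:

```
input {
  file {
    path => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf"
    start_position => beginning
    # a fixed, writable sincedb file (hypothetical path); delete it before
    # each run if you want Logstash to reprocess the file from the top
    sincedb_path => "C:/basefarm/sincedb"
  }
}
```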

Hi Magnusbaeck
Thank you for your quick response.
I have done what you suggested, but it only works when I comment out the elasticsearch output, i.e. "#elasticsearch { hosts => "localhost" }". If I uncomment the elasticsearch line, I get an error like:

Worker threads expected: 1, worker threads started: 1 {:level=>:info}
Automatic template management enabled {:manage_template=>"true", :level=>:info}
Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}, :level=>:info}
New Elasticsearch output {:hosts=>["localhost"], :level=>:info}
{
"message" => "55.3.244.1 GET /index.html 15824 0.043\r",
"@version" => "1",
"@timestamp" => "2015-11-02T05:36:54.660Z",
"host" => "Q07500-2K1",
"path" => "C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf",
"client" => "55.3.244.1",
"method" => "GET",
"request" => "/index.html",
"bytes" => "15824",
"duration" => "0.043"
}
Pipeline started {:level=>:info}
Logstash startup completed

Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2015.11.02", :_type=>"logs", :_routing=>nil}, #<LogStash::Event:0x38b449 @data={"message"=>"55.3.244.1 GET /index.html 15824 0.043\r", "@version"=>"1", "@timestamp"=>"2015-11-02T05:36:54.660Z", "host"=>"Q07500-2K1", "path"=>"C:/basefarm/logstash-2.0.0-beta2/logstash-2.0.0-beta2/g1a.conf", "client"=>"55.3.244.1", "method"=>"GET", "request"=>"/index.html", "bytes"=>"15824", "duration"=>"0.043"}, ...>], :response=>{"index"=>{"_index"=>"logstash-2015.11.02", "_type"=>"logs", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"mapping [default]", "caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"Mapping definition for [geoip] has unsupported parameters: [path : full]"}}}}, :level=>:warn}