My Logstash remains in the started state without producing any output

Hi,

ES 2.4.0
Logstash 5.5.1
I want to send documents from my local machine to Elasticsearch using Logstash. My config file is:

input {
  file {
    path => "C:\Users\571952\Desktop\logstash-5.1.1\logstash"
    start_position => "beginning"
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200/"
    index => "rearticles"
    codec => json_lines
  }
  stdout { codec => rubydebug }
}
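For comparison, a minimal sketch of a file input that points at concrete files rather than a bare directory (the *.json glob is an assumption about the filenames; the file input on Windows expects forward slashes in paths):

```conf
input {
  file {
    # Forward slashes plus an explicit glob so the input matches real files;
    # a bare directory path matches nothing. The "*.json" pattern is a guess
    # at what the documents are named.
    path => "C:/Users/571952/Desktop/logstash-5.1.1/logstash/*.json"
    start_position => "beginning"
    codec => json_lines
  }
}
```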

The error: even though I have given the path, it is showing as null.

[2017-01-12T15:05:06,181][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200/"]}}
[2017-01-12T15:05:06,181][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x15c34c7 URL:http://localhost:9200/>, :healthcheck_path=>"/"}
[2017-01-12T15:05:06,290][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x15c34c7 URL:http://localhost:9200/>}
[2017-01-12T15:05:06,290][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-01-12T15:05:06,337][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"*", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"*", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"*", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"*", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"*", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"*", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"*", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"*", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, "latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}}
[2017-01-12T15:05:06,353][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200/"]}
[2017-01-12T15:05:06,353][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-01-12T15:05:06,353][INFO ][logstash.pipeline        ] Pipeline main started
[2017-01-12T15:05:06,493][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Please check whether your user home directory contains a .sincedb file, which tracks the read position of each file Logstash has processed.

You can also inspect the sincedb activity by turning on --log.level=debug.
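If a stale sincedb offset is what is blocking the re-read, a common workaround while testing is to point sincedb at a throwaway location so the files are read from the beginning on every run. A sketch (the glob is a hypothetical filename pattern; "NUL" is the Windows equivalent of /dev/null):

```conf
input {
  file {
    # Hypothetical glob for the documents to ship.
    path => "C:/Users/571952/Desktop/logstash-5.1.1/logstash/*.json"
    start_position => "beginning"
    # Discard recorded offsets so each run starts at the top of every file.
    # On Windows use "NUL"; on Linux/macOS this would be "/dev/null".
    sincedb_path => "NUL"
    codec => json_lines
  }
}
```

This is for testing only; in production you normally want the offsets kept so events are not re-ingested.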

Yeah, I had seen it. It has contents like this:

0 0 0

Please try with --log.level=debug and attach the log file in a gist or something similar (please make sure private details are masked).

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.