Logstash input file not working

Hello, I'm new to the ELK Stack and I'm trying to send some router logs to Elasticsearch via Logstash, but it isn't working: the index is never created in Elasticsearch. Please help.
My config file is below:

input {
  file {
    path => "/User/fazal/Desktop/logstach/data.txt"
    start_position => "beginning"
    sincedb_path => "/tmp/mysincedbfile"
    codec => multiline {
      pattern => "^type"
      negate => "true"
      what => "next"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{WORD:type} %{WORD:channel} %{NUMBER:use} %{WORD:bitrate}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "demo"
  }
  stdout { codec => rubydebug }
}

Data.txt

type=frequency channel="2447/20/gn" use=7.6% bitrate=39.1kbps
frequency-network-count=0 noise-floor=-107 frequency-station-count=4

type=frequency channel="2412/20/gn" use=8.4% bitrate=150.9kbps
frequency-network-count=5 noise-floor=-110 frequency-station-count=13

type=frequency channel="2417/20/gn" use=9.2% bitrate=58.6kbps
frequency-network-count=1 noise-floor=-112 frequency-station-count=3

Logstash Log

[2019-06-20T19:03:51,705][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-20T19:03:51,724][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-06-20T19:03:55,668][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-20T19:03:55,955][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-20T19:03:56,032][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-20T19:03:56,036][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-06-20T19:03:56,090][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-20T19:03:56,090][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-20T19:03:56,132][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x327461fa run>"}
[2019-06-20T19:03:56,281][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-20T19:04:01,785][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-20T19:04:01,865][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-20T19:04:01,870][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-20T19:04:02,202][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Are you appending new entries to the log?

No, I just need to load the existing entries from data.txt into Elasticsearch.

start_position only has any effect when the file input sees a new file. It is possible that you ran Logstash once, the file input read the file and persisted the fact that it had done so in the sincedb, but for some reason it failed to write the data to Elasticsearch. If you then fixed the problem that prevented the write and restarted Logstash, it would not re-read the file. Take a look at /tmp/mysincedbfile (it is a text file) and see if it has a line for /User/fazal/Desktop/logstach/data.txt. You may need to stop Logstash, delete that line, then restart Logstash, or perhaps (depending on your use case) you may want to suppress persistence of the sincedb using 'sincedb_path => "/dev/null"'.
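For example, a quick sketch of how to check and reset it (paths taken from your config; the exact column layout of the sincedb varies slightly between file-input versions):

```shell
# Stop Logstash first. The sincedb is plain text; each line is roughly:
#   inode major_dev minor_dev bytes_read [last_active_timestamp path]
if [ -f /tmp/mysincedbfile ]; then
  cat /tmp/mysincedbfile
fi

# If there is a line for data.txt with a non-zero bytes_read offset, the
# file has already been consumed. Removing the sincedb makes the file
# input re-read it from the beginning on the next start:
rm -f /tmp/mysincedbfile
```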

/tmp/mysincedbfile is empty, and I have also tried setting sincedb_path => "/dev/null", with the same result. I don't understand why it isn't writing data to Elasticsearch. Is there any way to see deeper debug logs, to understand what's actually going wrong?

It could be that there is no line that triggers the multiline codec to push the accumulated lines into the pipeline, so the events sit in the codec's buffer and nothing ever reaches the output.
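For instance, something along these lines might help (a sketch, not tested against your exact data). Since each record starts with a "type" line and its continuation line does not, what => "previous" attaches continuation lines to the preceding "type" line, and auto_flush_interval (an option of the multiline codec) flushes the final buffered event after a period of inactivity instead of waiting for a next match that never comes:

```
input {
  file {
    path => "/User/fazal/Desktop/logstach/data.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^type"
      negate => true
      what => "previous"          # continuation lines join the preceding "type" line
      auto_flush_interval => 2    # flush the last pending event after 2s of inactivity
    }
  }
}
```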

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.