Parsing logs from file

Hello, I tried to parse logs from a file and send the data to Elasticsearch, but for some reason the file is not being parsed.

  1. I use the following "logstash.conf" file:

    input {
      file {
        path => "D:\Project\logs.log"
        start_position => "beginning"
      }
    }
    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}" }
      }
      if "_grokparsefailure" in [tags] {
        drop {}
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "logs-%{+YYYY.MM.dd}"
      }
      stdout { codec => rubydebug }
    }

  2. Here is the data from my "logs.log" file:

2018-12-06 23:00:00,068 INFO {Cluster} ETK-3-U-SET01 10.78.80.57 Alive 8.011134s
2018-12-06 23:00:00,068 INFO {External} - no external modules loaded yet, complete test transaction to load them -
2018-12-06 23:00:00,068 WARN {Job} END
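As a quick sanity check outside of Logstash, the grok pattern above can be approximated with a plain regex and run against the sample lines. This is a sketch in Python; the regex is a hand-written approximation of `%{TIMESTAMP_ISO8601}`, `%{LOGLEVEL}`, and `%{GREEDYDATA}`, not the exact grok definitions:

```python
import re

# Hand-written approximation of:
#   %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<loglevel>TRACE|DEBUG|INFO|WARN|ERROR|FATAL) "
    r"(?P<message>.*)"
)

sample_lines = [
    "2018-12-06 23:00:00,068 INFO {Cluster} ETK-3-U-SET01 10.78.80.57 Alive 8.011134s",
    "2018-12-06 23:00:00,068 WARN {Job} END",
]

for line in sample_lines:
    m = pattern.match(line)
    # Each sample line should match and split into the three named fields
    print(m.group("timestamp"), "|", m.group("loglevel"), "|", m.group("message"))
```

Since the pattern matches the sample lines, the grok filter itself is unlikely to be the problem here; the events are most likely never being read from the file in the first place.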

  3. And I have the following output in Logstash:

[2019-01-09T12:20:49,075][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-01-09T12:20:49,100][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-01-09T12:20:49,129][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-01-09T12:20:49,680][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/Programs/ELC/logstash654/data/plugins/inputs/file/.sincedb_2844bc93237b9dafac4fa003179d4180", :path=>["D:\Project\logs.log"]}
[2019-01-09T12:20:49,723][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6e9597af run>"}
[2019-01-09T12:20:49,792][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-09T12:20:49,799][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-09T12:20:50,105][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Please help me solve this issue.

Make sure to change the path to use forward slashes, as shown below:

    path => "D:/Project/logs.log"

In your Logstash output I can see that a sincedb_path was auto-generated, but it is not set in your input config. On Windows, you should set it explicitly:

    sincedb_path => "NUL"
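Putting both suggestions together, the input block would look something like the sketch below. Note that `sincedb_path => "NUL"` discards the read-position tracking on Windows, so the file is re-read from the beginning on every restart:

    input {
      file {
        path => "D:/Project/logs.log"
        start_position => "beginning"
        sincedb_path => "NUL"
      }
    }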

Thanks, balumurari1

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.