Path not loading in Logstash 7.3

Hello,
Good day!
Please help me with this.

I installed ELK 7.3 on a Windows 10 laptop.

Below is the conf file I am trying to run in Logstash. There is no error, but the file (path) is not loading; apart from that everything works fine!

input {
  file {
    path => "C:\logstash-7.3.0\bin\apache.log"
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "C:\logstash-7.3.0\bin\db"
    type => "logs"
  }
  stdin {}
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logfile"
  }
  stdout { codec => rubydebug }
}


After reading multiple articles, I added

start_position => "beginning"
ignore_older => 0
sincedb_path => "C:\logstash-7.3.0\bin\db"
type => "logs"

hoping that the path would load and work in Kibana, but it did not work, and the output of the log file (path) is not displayed in CMD either.

Thanks

Do not use backslash in the path option of a file input; the path setting is treated as a glob pattern, and backslashes do not work there. Use forward slash.

Dear Badger,
I am grateful for your reply.
As per your guidelines I changed to forward slashes, but no luck.
FYI
path =>"C:/logstash-7.3.0/bin/apache.log"
sincedb_path => "C:/logstash-7.3.0/bin/db"

Please check this and help me out.

This says to ignore any files more than zero seconds old, so it ignores all files. Remove it.
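If you do want to skip genuinely old files later, ignore_older takes an age in seconds. A minimal sketch (reusing the path from the original post):

```
file {
  path => "C:/logstash-7.3.0/bin/apache.log"
  start_position => "beginning"
  # ignore files last modified more than 86400 seconds (one day) ago;
  # a value of 0 makes every file "too old", so all files are skipped
  ignore_older => 86400
}
```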

Hi Badger,
Thanks a lot for your reply.

I removed the ignore_older => 0 and checked, but no progress; still the same issue.


I tried with and without the options below (even in combinations) and the result is the same:
start_position => "beginning"
ignore_older => 0
sincedb_path => "C:\logstash-7.3.0\bin\db"
type => "logs"


I wonder why I am struggling with such a small piece of code; if possible please give me the code, and let me try that as well.
Concept:
I just need to read a file (for example somewhere under C:) containing arbitrary data, and I need to see the result on stdout.

Gratitude
Sachin

Set the log.level to trace (either in logstash.yml or using '--log.level trace' on the command line) and see what filewatch has to say.
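For reference, the logstash.yml equivalent of that command-line flag is a single setting:

```
# logstash.yml
log.level: trace
```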

Appreciate your fast response!

Command: logstash -f path.conf --log.level trace

Result:

How about renaming your config file from path.conf to something else?
It might collide with some internal variable called path.

and keep only

    sincedb_path => "/dev/null"
    start_position => "beginning"

(on Windows, "NUL" takes the place of "/dev/null")

and test it.
I also see that you don't have a filter section; let's add one and keep it blank:
filter {}

Also, remove:
stdin {}
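Putting these suggestions together, a minimal test pipeline could look like the sketch below. The path is a placeholder for your own file; sincedb is sent to the null device so the file is re-read from the beginning on every run:

```
input {
  file {
    # placeholder path - substitute your own log file
    path => "C:/logstash-7.3.0/apache.log"
    start_position => "beginning"
    # use "NUL" on Windows, "/dev/null" on Linux
    sincedb_path => "NUL"
  }
}
filter {}
output {
  stdout { codec => rubydebug }
}
```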

You need to find a set of lines like the following in the trace:

[2019-08-24T16:02:30,115][TRACE][filewatch.sincedbcollection] open: reading from /dev/null
[2019-08-24T16:02:30,140][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-08-24T16:02:30,322][TRACE][filewatch.discoverer     ] discover_files {"count"=>1}
[2019-08-24T16:02:30,520][TRACE][filewatch.discoverer     ] discover_files handling: {"new discovery"=>true, "watched_file details"=> ...
[2019-08-24T16:02:30,557][TRACE][filewatch.sincedbcollection] associate: finding {"inode"=>"12714756", "path"=>"/home/foo.txt"}
[2019-08-24T16:02:30,565][TRACE][filewatch.sincedbcollection] associate: unmatched
[2019-08-24T16:02:30,774][TRACE][filewatch.tailmode.processor] Delayed Delete processing
[2019-08-24T16:02:30,802][TRACE][filewatch.tailmode.processor] Watched + Active restat processing
[2019-08-24T16:02:30,885][TRACE][filewatch.tailmode.processor] Rotation In Progress processing
[2019-08-24T16:02:30,914][TRACE][filewatch.tailmode.processor] Watched processing
[2019-08-24T16:02:30,933][TRACE][filewatch.tailmode.handlers.createinitial] handling: foo.txt
[2019-08-24T16:02:30,994][TRACE][filewatch.tailmode.handlers.createinitial] opening foo.txt
[2019-08-24T16:02:31,087][TRACE][filewatch.tailmode.handlers.createinitial] handle_specifically opened file handle: 85, path: foo.txt
[2019-08-24T16:02:31,175][TRACE][filewatch.tailmode.handlers.createinitial] add_new_value_sincedb_collection ...
[2019-08-24T16:02:31,242][TRACE][filewatch.tailmode.processor] Active - file grew: foo.txt: new size is 677, bytes read 0
[2019-08-24T16:02:31,249][TRACE][filewatch.tailmode.handlers.grow] handling: foo.txt
[2019-08-24T16:02:31,293][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>677, "filename"=>"foo.txt"}
[2019-08-24T16:02:31,306][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-08-24T16:02:31,356][DEBUG][logstash.inputs.file     ] Received line

Hi Badger,

Thanks for your support.

With the help of a friend, Sundaramoorthy, we resolved this issue.

Reason for the issue: the apache.log file was inside the bin folder.
Resolution: I moved apache.log to the desktop.

Code:

input {
  file {
    path => "C:/Users/skumarp8/Desktop/ELK Stack/apache.log"
    start_position => "beginning"
    type => "logs"
  }
  stdin {}
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logfile"
  }
  stdout { codec => rubydebug }
}

Gratitude
Sachin

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.