Sincedb file is being created empty

When I try to use the file input plugin to create an index, it does not work: the sincedb file is being created empty and the logs say discovered files count = 0.
Also, when I disable sincedb, it says it is opening the file /dev/null and still does not work.
Please help me.

As far as I know the sincedb file is always empty when created, as the plugin has not yet parsed anything. If you want help troubleshooting, you will need to show the exact error messages as well as the config that led to them.

These are the configurations.

Actually, I am learning the ELK stack and have set one up. While trying things out, I ran into trouble sending a file from Logstash to Elasticsearch. I googled it and people suggested disabling sincedb since it was causing problems, so I tried to disable it, but that did not work at all.
After weeks of troubleshooting I learned that Logstash was looking for a file that had been updated within the last 24 hours, so I changed the path to a different file, which I chose to be a Logstash file itself, and it did work with /dev/null and created the index. After checking, I commented out the /dev/null setting and checked again, and it was still working. I do not know how or who writes the sincedb file, when it writes the paths, or whether I need any configuration for it.
To understand all this, I commented out the file plugin that was working and added a new file plugin without defining a sincedb path. In the debug logs I saw that a sincedb file was created automatically, but it was empty, and because of that it says discovered files zero. Then I set the sincedb path to /dev/null to check whether it would use the same sincedb file or null. I could not understand why it was looking to open a file at /dev/null when I had pointed the sincedb file to that path.
So my questions are:
1. When exactly does the sincedb file get created, who writes it, and how does it get the paths and inodes written into it? I saw it written twice, but I am not sure what caused it to work or what I did.
2. I am also confused about whether I have missed some configuration.
3. Is it necessary to provide a template or type to define the data format in Logstash, since it is not needed in the Filebeat case?

I hope you will read this. I may be irritating in describing the problem at such length, but I tried my best to describe the exact problem that I have been trying to fix for a month.

Please do not post screenshots of text, as these are very difficult to read and cannot be searched or copied.

Providing just logs and config, without a proper description of the perceived issue and what you think the problem is, is not very helpful.

It seems that you have disabled sincedb by setting the path to /dev/null, so I am not sure what you are expecting.
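For clarity, these are the two behaviours being mixed up here (a sketch only; the custom path below is just an illustration):

sincedb_path => "/var/lib/logstash/plugins/inputs/file/.sincedb_custom"   # persist read positions in this file across restarts
sincedb_path => "/dev/null"                                               # keep positions in memory only; nothing survives a restart

Setting the path to /dev/null does not change where the plugin looks for your log files; it only changes where it stores (or, in this case, discards) its bookkeeping, which is why the logs still show it opening /dev/null.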

I am expecting that if I have disabled sincedb, then it should look for the file at the path directly, but instead it is looking for a file at /dev/null.

The file input plugin creates and updates the sincedb file, assuming it can be created and the path is not /dev/null. The plugin updates this file as data is read in order to keep track of what has been processed. It uses inodes to uniquely identify files and will not work as expected if you are updating or modifying the file in place. If you want a file to be reprocessed, instead delete it and create a new file with the desired content, as this will result in a new inode. If you disable sincedb the plugin will keep track of what has been processed in memory, but will reprocess everything at restart/startup.
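For reference, once the plugin has actually discovered and read a file, each sincedb line holds (in recent versions of the plugin) the inode, the major and minor device numbers, the current byte offset, a last-activity timestamp, and the last known path. A made-up example line:

2065602 0 51713 5462 1653894703.123 /var/log/elasticsearch/elasticsearch.log

As long as the discover_files count in your logs stays at 0, nothing will ever be written there, because no file matched the path pattern in the first place.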

file {
  path => "/var/log/Elasticsearch/*"
  type => "Elasticsearch-logs"
  codec => "plain"
  start_position => "beginning"
  sincedb_path => "/dev/null"
}

This is the file input plugin I am using.

[2022-05-30T12:41:42,133][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-05-30T12:41:42,144][INFO ][logstash.inputs.tcp ][main][c2bb6388889ad2844b16366097d501645d4a318f1050538ce0650782714c8b07] Starting tcp input listener {:address=>"192.168.56.101:5044", :ssl_enable=>false}
[2022-05-30T12:41:42,148][INFO ][logstash.inputs.tcp ][main][bf0444e31af6939c61fa9421788fb4bc85926403c3cce48245a40769a68b9d85] Starting tcp input listener {:address=>"192.168.56.101:10514", :ssl_enable=>false}
[2022-05-30T12:41:42,167][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x5acaf213 run>"}
[2022-05-30T12:41:42,177][DEBUG][io.netty.channel.DefaultChannelId][main][c2bb6388889ad2844b16366097d501645d4a318f1050538ce0650782714c8b07] -Dio.netty.proce

These are the logs for the tcp input plugins.

[2022-05-30T12:41:42,416][DEBUG][io.netty.buffer.ByteBufUtil][main][c2bb6388889ad2844b16366097d501645d4a318f1050538ce0650782714c8b07] -Dio.netty.maxThreadLocalCharBufferSize: 16384
[2022-05-30T12:41:42,421][INFO ][filewatch.observingtail ][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] START, creating Discoverer, Watch with file and sincedb collections
[2022-05-30T12:41:42,553][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-05-30T12:41:42,622][DEBUG][filewatch.sincedbcollection][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] open: reading from /var/lib/logstash/plugins/inputs/file/.sincedb_680aacbb58aa1d30be675c69ff3f7768
[2022-05-30T12:41:42,629][TRACE][filewatch.sincedbcollection][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] open: count of keys read: 0
[2022-05-30T12:41:42,661][TRACE][filewatch.discoverer ][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] discover_files {:count=>0}
[2022-05-30T12:41:43,703][DEBUG][filewatch.sincedbcollection][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] writing sincedb (delta since last write = 1653894703)
[2022-05-30T12:41:43,714][TRACE][filewatch.sincedbcollection][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_680aacbb58aa1d30be675c69ff3f7768 (time = 2022-05-30 12:41:43 +0530)
[2022-05-30T12:41:43,731][TRACE][filewatch.sincedbcollection][main][253c9d403246888fd6fb26b43cf0ca6e052727fa491c477f9103323b4e132841] non_atomic_write: {:time=>2022-05-30 12:41:43 +0530}

These are the logs for the file input plugin.

Now I want to know when and how it will discover the file and create the index in Elasticsearch.

I would recommend that you read the docs for the plugin.

I will, but can you tell me what I am doing wrong?

What I have understood is: are you saying that the sincedb file will not work if the file I want to send to Elasticsearch is being modified?

Correct. The plugin tails the file, which is what the sincedb file keeps track of. If data is changed in the file without a new file being created, this is not detected.
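As a side note, if the files you want to index are complete, static files rather than logs that keep growing, the file input also has a read mode that reads each matched file from start to finish. A minimal sketch, assuming a recent version of the plugin (the completion-log path is just a placeholder):

file {
  path => "/var/log/Elasticsearch/*"
  mode => "read"                                      # read whole files instead of tailing them
  sincedb_path => "/dev/null"                         # do not remember positions between restarts
  file_completed_action => "log"                      # the default is delete; log keeps the source files
  file_completed_log_path => "/tmp/files_read.log"    # placeholder path for the completion log
}

In tail mode (the default, which your config uses), the combination of start_position => "beginning" and sincedb_path => "/dev/null" already means everything matching path is read from the start on every pipeline restart, provided the files are actually discovered.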

Ohh, I just checked: my logstash-plain.log file had been created as a new file, so this just means that a new file was created, which sincedb detected, and the data was sent.

So can you tell me why in this case it is not creating the file, or how I could enable it?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.