Selecting "file" in logstash input does not send to elasticsearch

I am trying to send log files from logstash to elasticsearch.
I have completed the whole setup and the test to send the standard input to elasticsearch has passed.
Now, I want to send the file to elasticsearch by specifying the file, but nothing is sent to elasticsearch.
Logstash itself seems to be working fine
(at least it's not outputting any errors at runtime).
What do I need to check or fix so that the file is sent to elasticsearch?

Here is the logstash config file that I created.


input {
  #stdin { } <-OK

  file {
    path => ["/usr/local/elastic/datasets/logstash-tutorial.log"]
  }
}

output {
  elasticsearch {
    hosts => ["http://***.***.***.***:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
}

Then, run the following.
/usr/share/logstash/bin/logstash -f /etc/logstash/logstash-sample.conf --config.reload.automatic


Is the file already complete, or are you adding content after Logstash has started?

By default, Logstash tails files and starts at the end if the files already exist: File input plugin | Logstash Reference [7.11] | Elastic

Depending on your use case, you might do the following:

  1. If your files are always content-complete, you might want to switch the mode from tail to read: File input plugin | Logstash Reference [7.11] | Elastic
  2. If your files are not content-complete (e.g. log files) but you do not want the existing content to be skipped, switch the start_position of the file input to beginning: File input plugin | Logstash Reference [7.11] | Elastic

Note that the start position is only used when Logstash finds a new file. After that, Logstash persists the current position in the sincedb.
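As a minimal sketch, the two cases above could look like this (reusing the tutorial file path from your config; adjust to your own paths). For content-complete files:

file {
  path => ["/usr/local/elastic/datasets/logstash-tutorial.log"]
  mode => "read"
}

And for growing log files where the existing content should not be skipped:

file {
  path => ["/usr/local/elastic/datasets/logstash-tutorial.log"]
  mode => "tail"
  start_position => "beginning"
}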

Best regards


Thank you for your prompt reply.

In the test above, I used an "already completed" file.
However, the files I want to use in production are log files, so they will not be complete.

I have tried the two cases you gave me.

mode => "tail"
start_position => "beginning"

But in both cases, the result is the same: no logs are sent to elasticsearch.
Do you have any other ideas?

Logstash will not read that file again because it already has.

As @Wolfram_Haussig suggested, you should read up on how the sincedb works.

To reread that file, you need to clean out the sincedb data manually, or, while testing, add

sincedb_path => "/dev/null"

to the file input section. This is good for testing; take it out for production.
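Put together, a testing-only input section might look like this (a sketch using the tutorial file path from the earlier config):

file {
  path => ["/usr/local/elastic/datasets/logstash-tutorial.log"]
  start_position => "beginning"
  sincedb_path => "/dev/null"  # testing only: the read position is never persisted, so the file is reread on every run
}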

Depending on your system, you may need to start Logstash with sudo when using /dev/null.



Thanks for your comment.

You are right! That is very good to know!
I specified the sincedb_path as you suggested and confirmed that it is registered in elasticsearch.

If I want to benefit from the sincedb, what should I specify for sincedb_path?
If I am running Logstash for the first time, the sincedb doesn't exist yet, does it?

The sincedb will be automatically created in the right place if you do not set it. The sincedb_path will be shown in the Logstash logs during startup; you can set it if you want, but there is really no need.

If I recall correctly, the sincedb is created when Logstash is started for the first time.

Remember, /dev/null is really for dev and testing.

I understand now that the sincedb is automatically created in the right place.
When I searched for sincedb, I found several files called "sincedb" in my environment.
I think some of them are from rewriting the config and running Logstash several times.

However, this means that when I clean up the sincedb to reload the files, I don't know which ones to delete.

Is there any way to determine these?

Or is it possible to name the sincedb for each type of log?

For example, if I am ingesting nginx and apache logs, could I name them sincedb_nginx, sincedb_apache, and so on?

Yes, as described in the documentation, the config setting sincedb_path points to a file path for the sincedb of this specific file input. So, setting sincedb_path to <your directory>/nginx/sincedb would work.

Otherwise, as @stephenb has pointed out, you do not necessarily need to delete the sincedb. You could temporarily point your sincedb_path to /dev/null to read the existing entries. Once that is done, you can revert this config change and restart Logstash. After this, Logstash will read all new entries as expected.
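As a sketch, two file inputs with separate, descriptively named sincedb files could look like this (the directory paths here are hypothetical examples):

input {
  file {
    path => ["/var/log/nginx/*.log"]
    sincedb_path => "/usr/local/elastic/logstash/sincedb_nginx"
  }
  file {
    path => ["/var/log/apache2/*.log"]
    sincedb_path => "/usr/local/elastic/logstash/sincedb_apache"
  }
}

Each file input then tracks its read positions independently, so cleaning up one log type's sincedb does not affect the other.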


Thank you for your comment.

I have confirmed that the sincedb can be placed in any directory.
I'll specify /dev/null while testing file reading, and create the sincedb at the given path in production.

Finally, I will post the logstash-sample.conf that I created.

input {
  file {
    path => ["/var/log/nginx/*.log"]
    start_position => "beginning"
    #sincedb_path => "/dev/null"
    sincedb_path => "/usr/local/elastic/logstash/nginx/sincedb" # Any file path
  }
}

output {
  elasticsearch {
    hosts => ["http://***.***.***.***:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.