Logstash is not able to ingest data into Elasticsearch after a 1-2 hour time span?

What could be the possible cause? I have already checked the points below.

  1. No connection issue.
  2. No error messages received at Logstash.
  3. I have tried sending logs from a separate log file and they were transferred successfully, but while transferring the real-time logs it creates a problem.

Kindly share your suggestions to resolve this issue.

Thanks
Deepak Lohar

++ Adding more information to this; kindly check the points below:

  1. We have limited the log files to 15 MB in size; once the allocated 15 MB is completely filled, the file is renamed (only a suffix is added to the current file name).
  2. A new 15 MB file is then allocated with the same name, logs continue in this newly allocated file, and the cycle repeats continuously.
  3. So the switch from the current log file to the new log file might be the cause of the broken continuity (see the file input sketch after this list).
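For reference, if the ingestion side uses the Logstash `file` input, the documentation recommends watching both the active file and its rotated names, so lines written around the rename are not missed. A minimal sketch of such an input (the paths are hypothetical placeholders, not our actual file names):

```
input {
  file {
    # Watch the live log file and its renamed copies (e.g. app.log.1),
    # so content written just before/after the rotation is still read.
    path => ["/var/log/myapp/app.log", "/var/log/myapp/app.log.*"]
    start_position => "beginning"
  }
}
```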

Kindly share your input if you have faced this kind of issue earlier, or suggest any alternative to this approach.

Reply here or reach me directly via personal mail: Deepaklohar275@gmail.com

Thanks & Regards
Deepak Lohar

Do you mean "Logstash doesn't ingest data at all" or "Logstash ingests data for 1-2 hours and then stops ingesting data"?

Thanks for your response :slight_smile:
Logstash is ingesting data for a short time span only.

As I mentioned in the additional information, we have limited the file size, so the cause may be the switching of log files during backup, or the log files getting locked for a few seconds by the primary application.

I suspect that the above scenario could be interrupting the normal execution of Logstash.

Let me know if you want further clarification on this.

Thanks

Deepak Lohar

8827575920

Files will be available for read operations even while they are being updated. Your scenario should not cause anything that would end up breaking Logstash. Can you share your ingestion pipeline?
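For comparison, a bare-bones file-to-Elasticsearch pipeline typically looks like the sketch below; the path, host, and index name are placeholders, not a known configuration:

```
input {
  file {
    path => "/var/log/myapp/app.log"     # placeholder path
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # placeholder host
    index => "myapp-logs-%{+YYYY.MM.dd}"
  }
}
```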

Hi Parab,

Sorry for the inconvenience; the ingestion pipeline is not managed at our end, so I do not have details about its configuration, but I just checked with the concerned person and there is no issue with the ingestion pipeline config.

I really appreciate your answer in the reply above. Logstash successfully ingests logs from the file (15 MB), but when the current file is renamed and a new file is created with the old name, the continuity breaks and Logstash is not able to pick up logs from the new file.

But if we restart the service, it again picks up logs from the current log file and stops when the next backup (rotation) occurs.
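In case it helps the person managing the pipeline, these are the `file` input settings I would ask them to check for the rotation case; the values below are only an assumed example, not our actual configuration:

```
input {
  file {
    # Watch both the active file and the renamed copies (paths are placeholders).
    path => ["/var/log/myapp/app.log", "/var/log/myapp/app.log.*"]
    start_position => "beginning"
    # How often (in seconds) the watched files are checked for changes.
    stat_interval => 1
    # How often the path patterns are re-expanded to discover new files.
    discover_interval => 15
    # Keep the read-position database in a known location so it can be
    # inspected (or removed) while troubleshooting.
    sincedb_path => "/var/lib/logstash/sincedb_myapp"
  }
}
```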

Thanks
Deepak Lohar
