Not able to see logs until I delete the index

Hi,

I am not able to see the next day's logs until I delete the index manually from Kibana.
Below is the status of my Logstash. I am not seeing any error logs, even after enabling debug mode in logstash.yml.

Below is the pipelines.yaml entry for the problematic index.

- pipeline.id: a-b-c-log-processing
  pipeline.batch.size: 5000
  config.string: |
    input { pipeline { address => a_b_c } }
    filter {
      dissect { mapping => { "message" => "[%{LOGLEVEL}] %{request}" } }
      mutate { remove_field => ["message"] }
      dissect { mapping => { "request" => "%{...}" } }  # mapping truncated in the original post
      mutate { remove_field => ["request","ORDERSEQNO","agent","log","host","input","ecs","tags"] }
      date { match => ['LOGDATETIME', 'YYYY-MM-dd HH:mm:ss.SSS'] }
    }
    output {
      elasticsearch {
        hosts => [ "http://xxxx:9200","http://xxxx:9200","http://xxxx:9200" ]
        index => "a-b-c-%{+YYYY.MM.dd}"
        user => "xxxxx"
        password => "xxxxxxxxxxxxxx"
      }
    }
Some background: we have three applications that generate logs in a similar format, and we use Filebeat to send those logs to Logstash, where we apply the filter above. Of the three applications a, b, and c, application "c" produces about 90% of the logs. I have no problem seeing the logs of applications a and b, but the logs of application c do not appear in Kibana until I delete the index. However, if I remove the filter from pipelines.yaml, there is no problem.

I am not getting any errors from Logstash or Elasticsearch either, and I am not sure how to troubleshoot this further. Please help me out; I have been stuck on this issue for the last 3 weeks and still have no clue. I also tried increasing the batch size from 50 to 500 to 5000, but the problem still persists.

Can you create another output just for the c producer and manually send its events to stdout, a log file, or another index?
Just to help you see what's going on.
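A rough sketch of what that output section could look like. The `[fields][app]` condition is an assumption here: substitute whatever field in your events actually identifies the c application (for example a Filebeat field or the log file path), and keep your existing `elasticsearch` output alongside it:

```text
output {
  # Hypothetical condition: adjust to whatever identifies c's events in your data
  if [fields][app] == "c" {
    # Print c's events after filtering, so you can see exactly what was produced
    stdout { codec => rubydebug }
    # Or write them to a separate file for later inspection
    file { path => "/tmp/c-events-%{+YYYY.MM.dd}.log" }
  }
}
```

If the events show up correctly on stdout but not in Kibana, the problem is likely on the index/mapping side rather than in the filter; if they look wrong or are missing here, the filter is mangling or dropping them.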

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.