I use Filebeat to forward logs from another server to Logstash, and Logstash then sends them to Elasticsearch.
In filebeat.yml I specify Logstash as the output, but in logstash.conf I don't specify any output at all. Yet in Kibana I can see that logs are still coming in for the logstash-* index.
Could anybody explain how it is possible to get logs without an output in the logstash.conf file?
(I run ELK in 3 Docker containers, linked with Elasticsearch.)
filebeat.yml:
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log
registry_file: /var/lib/filebeat/registry
output.logstash:
  hosts: ["my_host:5044"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
 
logstash.conf:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/usr/share/logstash/ssl/logstash-forwarder.crt"
    ssl_key => "/usr/share/logstash/ssl/logstash-forwarder.key"
  }
} 

warkolm (Mark Walkom), April 22, 2017, 3:06am (#2)

              Is there more to your LS config?
Only a filter, but it shouldn't matter. I tried pointing the output at a wrong server and it still worked, and it also worked with no output at all.
filter {
  if [type] == "apache-access" {
    grok {
      match => [ "message", "%{COMBINEDAPACHELOG}" ]
    }
  }
} 
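For context on what that grok filter does: COMBINEDAPACHELOG pulls fields such as clientip, verb, request, response, and bytes out of an Apache combined log line. A rough Python sketch of the same extraction (the regex here is a simplified approximation written for illustration, not the actual grok pattern):

```python
import re

# Simplified approximation of grok's COMBINEDAPACHELOG pattern.
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) \S+" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('127.0.0.1 - - [22/Apr/2017:03:06:00 +0000] '
        '"GET /index.html HTTP/1.1" 200 1043 "-" "curl/7.52.1"')

# Named groups become a dict of extracted fields.
fields = COMBINED.match(line).groupdict()
print(fields["verb"], fields["response"])  # GET 200
```

In Logstash, these captures become top-level fields on the event, which is why they show up as separate columns in Kibana.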
Correct, there is no output. And as you can see, Filebeat's output is Logstash, not Elasticsearch.
This is what I have in logstash.conf in Docker container
/usr/share/logstash/pipeline/logstash.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/usr/share/logstash/ssl/logstash-forwarder.crt"
    ssl_key => "/usr/share/logstash/ssl/logstash-forwarder.key"
  }
}
filter {
  if [type] == "apache-access" {
    grok {
      match => [ "message", "%{COMBINEDAPACHELOG}" ]
    }
  }
}
 
But I can still see new logs in Kibana for the logstash-* index coming from the server with Filebeat installed.
How is that possible?
I am using Logstash 2.4.0.
Logstash processes the logs for a few minutes and then stops processing. I get the error below in the logs.
{:timestamp=>"2017-04-22T20:23:27.947000+0200", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x2266677 @operations_mutex=#Mutex:0xce19843 , @max_size=500, @operations_lock=#Java::JavaUtilConcurrentLocks::ReentrantLock:0x8de0d07 , @submit_proc=#Proc:0x7cf8406a@/opt/nedi/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:57 , @logger=#<Cabin::Channel:0x7c5bfc36 @metrics=#<Cabin::Metrics:0x495708f @metrics_lock=#Mutex:0x2df36d37 , @metrics={}, @channel=#<Cabin::Channel:0x7c5bfc36 ...>>, @subscriber_lock=#Mutex:0x133683f3 , @level=:debug, @subscribers={13206=>#<Cabin::Subscriber:0x1f91786d @output=#<Cabin::Outputs::IO:0x29bd347d @io=#<File:/var/nedi/logs/logstash_popfile_to_es.log>, @lock=#Mutex:0xe3313a2 >, @options={}>, 13208=>#<Cabin::Subscriber:0x45688d84 @output=#<Cabin::Outputs::IO:0x51aed754 @io=#<IO:fd 1>, @lock=#Mutex:0x11f01fc0 >, @options={:level=>:fatal}>}, @data={}>, @last_flush=2017-04-22 20:23:26 +0200, @flush_interval=1, @stopping=#Concurrent::AtomicBoolean:0x64d098f , @buffer=[], @flush_thread=#<Thread:0x217e6a65 run>>", :interval=>1, :level=>:debug, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
Please advise.

warkolm (Mark Walkom), April 23, 2017, 8:21pm (#7)

              Create your own thread please.

warkolm (Mark Walkom), April 23, 2017, 8:22pm (#8)

              You must have another config file in the directory with an output that is being read.
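To expand on that: Logstash concatenates every config file it finds under its configured path (in the official Docker image, /usr/share/logstash/pipeline/) into a single pipeline, merging all inputs, filters, and outputs. A leftover file along these lines would explain the logstash-* index; the file name and host below are assumptions for illustration, not something taken from this thread:

```conf
# Hypothetical leftover file, e.g. /usr/share/logstash/pipeline/99-output.conf
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    # With no explicit index option, the elasticsearch output plugin writes
    # to logstash-%{+YYYY.MM.dd}, matching what shows up in Kibana.
  }
}
```

Listing that directory inside the container (for example, docker exec &lt;container&gt; ls /usr/share/logstash/pipeline) will show whether such an extra file is being picked up.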

system (system) Closed, May 21, 2017, 8:26pm (#9)

              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.