Hi, I'm trying to harvest the Docker logs of one of our applications and have configured outputs to both Elasticsearch and File, but nothing is being shipped to either. I'm sharing the filebeat logs for reference:

2016/09/15 22:13:01.859214 outputs.go:126: INFO Activated elasticsearch as output plugin.
2016/09/15 22:13:01.859242 file.go:39: INFO File output base filename set to: filebeat_log
2016/09/15 22:13:01.859252 file.go:50: INFO Rotate every bytes set to: 10240000
2016/09/15 22:13:01.859256 file.go:57: INFO Number of files set to: 7
2016/09/15 22:13:01.859301 outputs.go:126: INFO Activated file as output plugin.
2016/09/15 22:13:01.859377 publish.go:288: INFO Publisher name: Twilio-blue
2016/09/15 22:13:01.859592 async.go:78: INFO Flush Interval set to: 1s
2016/09/15 22:13:01.859599 async.go:84: INFO Max Bulk Size set to: 50
2016/09/15 22:13:01.859627 async.go:78: INFO Flush Interval set to: -1ms
2016/09/15 22:13:01.859630 async.go:84: INFO Max Bulk Size set to: -1
2016/09/15 22:13:01.859638 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/09/15 22:13:01.860500 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/09/15 22:13:01.860542 registrar.go:68: INFO Registry file set to: /var/lib/filebeat/registry
2016/09/15 22:13:01.860615 prospector.go:133: INFO Set ignore_older duration to 5m0s
2016/09/15 22:13:01.860720 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/09/15 22:13:01.860734 prospector.go:133: INFO Set scan_frequency duration to 10s
2016/09/15 22:13:01.860743 prospector.go:93: INFO Input type set to: log
2016/09/15 22:13:01.860752 prospector.go:133: INFO Set backoff duration to 1s
2016/09/15 22:13:01.860760 prospector.go:133: INFO Set max_backoff duration to 10s
2016/09/15 22:13:01.860769 prospector.go:113: INFO force_close_file is disabled
2016/09/15 22:13:01.860795 prospector.go:143: INFO Starting prospector of type: log
2016/09/15 22:13:01.861072 crawler.go:78: INFO All prospectors initialised with 1 states to persist
2016/09/15 22:13:01.861121 registrar.go:87: INFO Starting Registrar
2016/09/15 22:13:01.861184 publish.go:88: INFO Start sending events to output
2016/09/15 22:13:01.861515 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/09/15 22:13:01.861890 log.go:113: INFO Harvester started for file: /var/lib/docker/containers/1a51c1c3322cf648da8fb49f3de0ed53651a7f9f4d6ed37a49eb69e87ce53602/1a51c1c3322cf648da8fb49f3de0ed53651a7f9f4d6ed37a49eb69e87ce53602-json.log

My filebeat version is 1.2.3 (amd64).
Elasticsearch version is 2.1.0 (Build: 72cd1f1/2015-11-18T22:40:03Z, JVM: 1.8.0_66-internal).

Below is my filebeat.yml config

################### Filebeat Configuration Example #########################

############################# Filebeat ######################################
filebeat:
  prospectors:
    -
      paths:
        - /var/lib/docker/containers/1a51c1c3322c*/*.log
      encoding: utf-8
      input_type: log
      include_lines: ["Streaming"]
      ignore_older: 5m
      scan_frequency: 10s

  idle_timeout: 5s
  registry_file: /var/lib/filebeat/registry

###############################################################################
############################# Libbeat Config ##################################
# Base config file used by all other beats for using libbeat features

############################# Output ##########################################

output:

  ### Elasticsearch as output
  elasticsearch:
    hosts: ["xx.xx.xx.xx:9200"]
    index: "oneroom"

  ### File as output
  file:
    path: "/opt/filebeat"
    filename: filebeat_log
    rotate_every_kb: 10000
    number_of_files: 7

############################# Shipper #########################################

shipper:
  name: "Twilio-blue"

Can you share some more details about your setup? Is filebeat running in the same container as the app or in a separate container? Can you enable debug level logging and share the output? I would expect the interesting logs to start after the last line you published.
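
One thing worth double-checking in the meantime: in filebeat 1.x the entries in include_lines are regular expressions, and once include_lines is set, only lines matching at least one of them are exported. With include_lines: ["Streaming"], any container log line that does not contain "Streaming" is dropped before it ever reaches the outputs. As a quick sanity test you could run a stripped-down prospector without that filter, roughly like the sketch below (paths taken from your config, everything else left at the defaults):

filebeat:
  prospectors:
    # Test prospector: same path as before but without include_lines,
    # so every line of the docker json log should be shipped.
    -
      paths:
        - /var/lib/docker/containers/1a51c1c3322c*/*.log
      input_type: log
      encoding: utf-8
  registry_file: /var/lib/filebeat/registry

If lines then show up in /opt/filebeat/filebeat_log, the filter is why nothing arrives; if they still don't, the debug log should tell us more.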

Hi ruflin, the application is running in a separate Docker container. I have enabled the logging and this is the config below:

logging:
  to_syslog: true
  to_files: true
  path: /var/log/mybeat
  name: mybeat
  rotateeverybytes: 10485760 # = 10MB
  keepfiles: 7

After enabling the logging I restarted the Docker container, but I don't see any logs in the path specified for it.

And I see the error below in the filebeat container logs; I'm not sure why:

2016/09/16 17:53:25.789205 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths

Your logging config seems to be off somehow. For filebeat 1.x, use the following config to enable logging to a file:

logging:
  to_files: true
  files:
    path: /var/log/mybeat
    name: mybeat
    rotateeverybytes: 10485760 # = 10MB
    keepfiles: 7
  level: debug
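
Once filebeat is restarted with that config, the debug output should end up under /var/log/mybeat/, and the lines that follow the "Harvester started" message should show whether events are actually getting published to elasticsearch and file.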

The INFO message you saw above can be ignored.
