Filebeat on Solaris amd64 failing

I picked up the binary from here: https://beats-nightlies.s3.amazonaws.com/index.html?prefix=jenkins/filebeat/715-32446edc97bbfe933b1539a473e82a8b4a3a6942/

When starting the service with svcadm, I see the following panic in the SMF logs:

goroutine 11 [running]:
panic(0xa2cec0, 0xc820010090)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:464 +0x3ef
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).Close(0xc8201a66a0, 0x0, 0x0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:158 +0x3f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onStop(0xc82007d4f0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:60 +0x48
github.com/elastic/beats/libbeat/publisher.(*messageWorker).shutdown(0xc82007d4f0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:75 +0x3a
panic(0xa2cec0, 0xc820010090)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:426 +0x4f2
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).BulkPublish(0xc8201a66a0, 0xfffffd7ffee908c0, 0xc820502588, 0x1, 0xc82022e000, 0x7e8, 0x800, 0x0, 0x0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:179 +0x8d
github.com/elastic/beats/libbeat/publisher.(*outputWorker).sendBulk(0xc82007d4f0, 0xc820058d28, 0xc82022e000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:111 +0x190
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onBulk(0xc82007d4f0, 0xc820058d28, 0xc82022e000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:87 +0x39f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onMessage(0xc82007d4f0, 0xc8202dc1c0, 0x101, 0xfffffd7ffee908c0, 0xc820502588, 0x0, 0xc82022e000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:70 +0x85
github.com/elastic/beats/libbeat/publisher.(*messageWorker).onEvent(0xc82007d4f0, 0xc8202dc1c0, 0x101, 0xfffffd7ffee908c0, 0xc820502588, 0x0, 0xc82022e000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:83 +0x62
github.com/elastic/beats/libbeat/publisher.(*messageWorker).run(0xc82007d4f0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:69 +0x1e2
created by github.com/elastic/beats/libbeat/publisher.(*messageWorker).init
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:57 +0x11e
[ Aug 11 05:59:17 Stopping because service exited with an error. ]
[ Aug 11 05:59:17 Executing start method ("/opt/gl-filebeat/bin/filebeat-solaris-amd64 -c /opt/gl-filebeat/bin/filebeat.yml"). ]

The config test looks good:

/opt/gl-filebeat/bin/filebeat-solaris-amd64 -c /opt/gl-filebeat/bin/filebeat.yml -e -configtest
2016/08/11 05:49:13.300095 beat.go:263: INFO Home path: [/opt/gl-filebeat/bin] Config path: [/opt/gl-filebeat/bin] Data path: [/opt/gl-filebeat/bin/data] Logs path: [/opt/gl-filebeat/bin/logs]
2016/08/11 05:49:13.300161 logp.go:210: INFO Metrics logging every 30s
2016/08/11 05:49:13.300255 beat.go:174: INFO Setup Beat: filebeat; Version: 5.0.0-alpha5-git32446ed
2016/08/11 05:49:13.300400 logstash.go:96: INFO Max Retries set to: 3
2016/08/11 05:49:13.300457 outputs.go:87: INFO Activated logstash as output plugin.
2016/08/11 05:49:13.300667 publish.go:301: INFO Publisher name: app1.stg0
2016/08/11 05:49:13.300704 async.go:63: INFO Flush Interval set to: 1s
2016/08/11 05:49:13.300720 async.go:64: INFO Max Bulk Size set to: 2048
Config OK

My configuration file (posted as-is):

filebeat:
        registry_file: /opt/gl-filebeat/.gl-filebeat-registry
 
        spool_size: 2048
        prospectors:
         -
                paths:
                 - /var/svc/log/application.log
                

        input_type: log
        ignore_older: 1h
        max_bytes: 2097152
        fields:
                ENV:_ENV_
               PLATFORM: _PLATFORM_
                ASSEMBLY: ${hostname}
                IPv4: __IP__

        fields_under_root: false
 
# Setting tail_files to true means filebeat starts readding new files at the end
# instead of the beginning. If this is used in combination with log rotation
# this can mean that the first entries of a new file are skipped.
        tail_files: true

        multiline:
         pattern: "^[[:space:]]+(at|...)|^Caused by:"
         negate: false
         match: after
  
output:
  logstash:
  hosts: [_SERVER_HOST]
#  worker: 2
#  max_retries: -1
#  bulk_max_size: 2048
   
logging:
  level: info
  to_files: true 
  to_syslog: false 
  files:
   path: /var/log/filebeat
   name: filebeat.log
   keepfiles: 7

Were there any log lines before the panic? What is the exact command you were using to start Filebeat?

Here are some more lines. The goroutine tries to run and then fails; these traces repeat with different goroutine numbers.

goroutine 49 [running]:
panic(0xa2cec0, 0xc820010090)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:464 +0x3ef
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).Close(0xc8201629a0, 0x0, 0x0)
     /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:158 +0x3f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onStop(0xc82028c0a0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:60 +0x48
github.com/elastic/beats/libbeat/publisher.(*messageWorker).shutdown(0xc82028c0a0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:75 +0x3a
panic(0xa2cec0, 0xc820010090)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:426 +0x4f2
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).BulkPublish(0xc8201629a0, 0xfffffd7ffe9a7500, 0xc8204edf58, 0x1, 0xc8206c8000, 0x7e8, 0x800, 0x0, 0x0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:179 +0x8d
github.com/elastic/beats/libbeat/publisher.(*outputWorker).sendBulk(0xc82028c0a0, 0xc8202b0d28, 0xc8206c8000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:111 +0x190
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onBulk(0xc82028c0a0, 0xc8202b0d28, 0xc8206c8000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:87 +0x39f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onMessage(0xc82028c0a0, 0xc82016dc40, 0x101, 0xfffffd7ffe9a7500, 0xc8204edf58, 0x0, 0xc8206c8000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:70 +0x85
github.com/elastic/beats/libbeat/publisher.(*messageWorker).onEvent(0xc82028c0a0, 0xc82016dc40, 0x101, 0xfffffd7ffe9a7500, 0xc8204edf58, 0x0, 0xc8206c8000, 0x7e8, 0x800)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:83 +0x62
github.com/elastic/beats/libbeat/publisher.(*messageWorker).run(0xc82028c0a0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:69 +0x1e2
created by github.com/elastic/beats/libbeat/publisher.(*messageWorker).init
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:57 +0x11e
[ Aug 11 05:59:17 Stopping because service exited with an error. ]
[ Aug 11 05:59:17 Executing start method ("/opt/gl-filebeat/bin/filebeat-solaris-amd64 -c /opt/gl-filebeat/bin/filebeat.yml"). ]
panic: runtime error: invalid memory address or nil pointer dereference
        panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x20 pc=0x62bd6f]

goroutine 11 [running]:
panic(0xa2cec0, 0xc820010090)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:464 +0x3ef
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).Close(0xc8201a66a0, 0x0, 0x0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:158 +0x3f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onStop(0xc82007d4f0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:60 +0x48
github.com/elastic/beats/libbeat/publisher.(*messageWorker).shutdown(0xc82007d4f0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:75 +0x3a
panic(0xa2cec0, 0xc820010090)

I generated a manifest and imported it; the SMF service runs with the following start method:

exec_method type="method" name="start" exec="/opt/gl-filebeat/bin/filebeat-solaris-amd64 -c /opt/gl-filebeat/bin/filebeat.yml" timeout_seconds="60"
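For completeness, a minimal manifest along those lines might look like this (the service and instance names here are hypothetical placeholders; only the start method's exec string and timeout are taken from above, so treat this as a sketch rather than the exact manifest in use):

```xml
<?xml version="1.0"?>
<!DOCTYPE service_bundle SYSTEM "/usr/share/lib/xml/dtd/service_bundle.dtd.1">
<!-- Sketch only: service/instance names are placeholders. -->
<service_bundle type="manifest" name="gl-filebeat">
  <service name="application/gl-filebeat" type="service" version="1">
    <create_default_instance enabled="true"/>
    <single_instance/>
    <exec_method type="method" name="start"
                 exec="/opt/gl-filebeat/bin/filebeat-solaris-amd64 -c /opt/gl-filebeat/bin/filebeat.yml"
                 timeout_seconds="60"/>
    <!-- :kill is the conventional SMF stop method -->
    <exec_method type="method" name="stop" exec=":kill" timeout_seconds="60"/>
  </service>
</service_bundle>
```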

Can you enable the debug or info log level and post the Filebeat logs from before the incident happens?

Could you also try to run filebeat manually and see what happens?

The Filebeat logs look fine with DEBUG, yet the SMF logs still show those errors and my Logstash server never receives any messages from these hosts.

2016-08-11T14:50:43Z DBG Publish: {
"@timestamp": "2016-08-11T14:50:43.019Z",
"beat": {
"hostname": "app1.stg0",
"name": "app1.stg0"
},
"input_type": "log",
"message": "blah blah blah",
"offset": 19533,
"source": "/var/svc/log/site-app-version:preview1.log.5",
"type": "log"
}
2016-08-11T14:50:43Z DBG Publish: {
"@timestamp": "2016-08-11T14:50:43.019Z",
"beat": {
"hostname": "app1.stg0",
"name": "app1.stg0"
},
"input_type": "log",
"message": "blah blah blah",
"offset": 19860,
"source": "/var/svc/log/site-app-version:preview1.log.5",
"type": "log"
}
2016-08-11T14:50:43Z DBG Publish: {
"@timestamp": "2016-08-11T14:50:43.019Z",
"beat": {
"hostname": "app1.stg0",
"name": "app1.stg0"
},
"input_type": "log",
"message": "blah blah blah",
"offset": 43365,
"source": "/var/svc/log/site-app-version:default1.log.2",
"type": "log"
}
2016-08-11T14:50:43Z DBG Publish: {
"@timestamp": "2016-08-11T14:50:43.019Z",
"beat": {
"hostname":"app1.stg0",
"name": "app1.stg0"
},
"input_type": "log",
"message": "blah blah blah",
"offset": 20195,
"source": "/var/svc/log/site-app-version:preview1.log.5",
"type": "log"
}
2016-08-11T14:50:43Z DBG Publish: {
"@timestamp": "2016-08-11T14:50:43.019Z",
"beat": {
"hostname": "app1.stg0",
"name": "app1.stg0"
},
"input_type": "log",
"message": "blah blah blah",
"offset": 20522,
"source": "/var/svc/log/site-app-version:preview1.log.5",
"type": "log"
}
2016-08-11T14:50:43Z DBG output worker: publish 2020 events

I started it manually:
./filebeat-solaris-amd64 -c filebeat.yml
panic: runtime error: invalid memory address or nil pointer dereference
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x20 pc=0x62bd6f]

goroutine 49 [running]:
panic(0xa2cec0, 0xc8200100d0)
/home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:464 +0x3ef
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).Close(0xc8201a4b00, 0x0, 0x0)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:158 +0x3f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onStop(0xc820294140)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:60 +0x48
github.com/elastic/beats/libbeat/publisher.(*messageWorker).shutdown(0xc820294140)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:75 +0x3a
panic(0xa2cec0, 0xc8200100d0)
/home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:426 +0x4f2
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).BulkPublish(0xc8201a4b00, 0xfffffd7fff038808, 0xc820461398, 0x1, 0xc8205d2000, 0x7e4, 0x800, 0x0, 0x0)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:179 +0x8d
github.com/elastic/beats/libbeat/publisher.(*outputWorker).sendBulk(0xc820294140, 0xc820645d18, 0xc8205d2000, 0x7e4, 0x800)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:111 +0x190
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onBulk(0xc820294140, 0xc820645d18, 0xc8205d2000, 0x7e4, 0x800)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:87 +0x39f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onMessage(0xc820294140, 0xc820288700, 0x101, 0xfffffd7fff038808, 0xc820461398, 0x0, 0xc8205d2000, 0x7e4, 0x800)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:70 +0x85
github.com/elastic/beats/libbeat/publisher.(*messageWorker).onEvent(0xc820294140, 0xc820288700, 0x101, 0xfffffd7fff038808, 0xc820461398, 0x0, 0xc8205d2000, 0x7e4, 0x800)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:83 +0x62
github.com/elastic/beats/libbeat/publisher.(*messageWorker).run(0xc820294140)
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:69 +0x1e2
created by github.com/elastic/beats/libbeat/publisher.(*messageWorker).init
/home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:57 +0x11e

That's weird. From the trace it looks like Filebeat is shutting down.

Also, I don't see any Filebeat logs here. Can you run Filebeat with -e -v -d '*' to print all debug logs to the console?

./filebeat-solaris-amd64 -e -v -d '*' -c filebeat.yml 
2016/08/11 16:57:51.719438 beat.go:263: INFO Home path: [/opt/gl-filebeat/bin] Config path: [/opt/gl-filebeat/bin] Data path: [/opt/gl-filebeat/bin/data] Logs path: [/opt/gl-filebeat/bin/logs]
2016/08/11 16:57:51.719502 beat.go:174: INFO Setup Beat: filebeat; Version: 5.0.0-alpha5-git32446ed
2016/08/11 16:57:51.719525 processor.go:43: DBG  Processors: 
2016/08/11 16:57:51.719530 logp.go:210: INFO Metrics logging every 30s
2016/08/11 16:57:51.719553 beat.go:180: DBG  Initializing output plugins
2016/08/11 16:57:51.719682 logstash.go:96: INFO Max Retries set to: 3
2016/08/11 16:57:51.719726 outputs.go:87: INFO Activated logstash as output plugin.
2016/08/11 16:57:51.719742 publish.go:245: DBG  Create output worker
2016/08/11 16:57:51.719810 publish.go:287: DBG  No output is defined to store the topology. The server fields might not be filled.
2016/08/11 16:57:51.719896 publish.go:301: INFO Publisher name: dal-quimby-app1.stg0
2016/08/11 16:57:51.719925 async.go:63: INFO Flush Interval set to: 1s
2016/08/11 16:57:51.719943 async.go:64: INFO Max Bulk Size set to: 2048
2016/08/11 16:57:51.719961 async.go:72: DBG  create bulk processing worker (interval=1s, bulk size=2048)
2016/08/11 16:57:51.720132 beat.go:204: INFO filebeat start running.
2016/08/11 16:57:51.720245 registrar.go:61: INFO Registry file set to: /opt/gl-filebeat/.gl-filebeat-registry
2016/08/11 16:57:51.720331 registrar.go:94: INFO Loading registrar data from /opt/gl-filebeat/.gl-filebeat-registry
2016/08/11 16:57:51.720399 registrar.go:127: DBG  Error decoding old state: json: cannot unmarshal array into Go value of type map[string]file.State
2016/08/11 16:57:51.720433 registrar.go:111: INFO States Loaded from registrar: 0
2016/08/11 16:57:51.720466 crawler.go:35: INFO Loading Prospectors: 1
2016/08/11 16:57:51.720525 spooler.go:67: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/08/11 16:57:51.720589 publish.go:101: INFO Start sending events to output
2016/08/11 16:57:51.720562 prospector_log.go:38: DBG  exclude_files: []
2016/08/11 16:57:51.720666 prospector_log.go:40: INFO Load previous states from registry into memory
2016/08/11 16:57:51.720723 prospector_log.go:53: INFO Previous states loaded: 0
2016/08/11 16:57:51.720911 prospector.go:55: DBG  File Configs: [/var/svc/log/site-app-version* /var/svc/log/site-merch* /var/svc/log/application-filebeat*]
2016/08/11 16:57:51.720936 crawler.go:47: INFO Loading Prospectors completed. Number of prospectors: 1
2016/08/11 16:57:51.720954 crawler.go:62: INFO All prospectors are initialised and running with 0 states to persist
2016/08/11 16:57:51.720977 crawler.go:57: DBG  Starting prospector 0
2016/08/11 16:57:51.721054 prospector.go:94: INFO Starting prospector of type: log
2016/08/11 16:57:51.721137 prospector_log.go:57: DBG  Start next scan
2016/08/11 16:57:51.720577 registrar.go:173: INFO Starting Registrar
2016/08/11 16:57:51.722301 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/application-filebeat:filebeat.log
2016/08/11 16:57:51.722343 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/application-filebeat:filebeat.log
2016/08/11 16:57:51.722409 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/application-filebeat:filebeat.log.1
2016/08/11 16:57:51.722428 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/application-filebeat:filebeat.log.1
2016/08/11 16:57:51.722496 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/application-filebeat:filebeat.log.2
2016/08/11 16:57:51.722516 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/application-filebeat:filebeat.log.2
2016/08/11 16:57:51.722530 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/application-filebeat:filebeat.log
2016/08/11 16:57:51.722571 log.go:207: DBG  Setting offset for file: /var/svc/log/application-filebeat:filebeat.log. Offset: 0
2016/08/11 16:57:51.722581 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:default1.log.2
2016/08/11 16:57:51.722604 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:default1.log.2
2016/08/11 16:57:51.722655 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:default1.log.6
2016/08/11 16:57:51.722688 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/application-filebeat:filebeat.log.2
2016/08/11 16:57:51.722709 log.go:207: DBG  Setting offset for file: /var/svc/log/application-filebeat:filebeat.log.2. Offset: 0 
2016/08/11 16:57:51.722721 log.go:45: INFO Harvester started for file: /var/svc/log/application-filebeat:filebeat.log.2
2016/08/11 16:57:51.722574 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/application-filebeat:filebeat.log.1
2016/08/11 16:57:51.722757 log.go:234: DBG  Update state: /var/svc/log/application-filebeat:filebeat.log.2, offset: 0
2016/08/11 16:57:51.722798 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:default1.log.6
2016/08/11 16:57:51.722817 log.go:207: DBG  Setting offset for file: /var/svc/log/application-filebeat:filebeat.log.1. Offset: 0 
2016/08/11 16:57:51.722983 log.go:45: INFO Harvester started for file: /var/svc/log/application-filebeat:filebeat.log.1
2016/08/11 16:57:51.722917 state.go:64: DBG  New state added for /var/svc/log/application-filebeat:filebeat.log.2
2016/08/11 16:57:51.722593 log.go:45: INFO Harvester started for file: /var/svc/log/application-filebeat:filebeat.log
2016/08/11 16:57:51.723051 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:preview1.log.0
2016/08/11 16:57:51.723247 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:preview1.log.0
2016/08/11 16:57:51.723358 log.go:234: DBG  Update state: /var/svc/log/application-filebeat:filebeat.log, offset: 0
2016/08/11 16:57:51.723496 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/site-app-version:preview1.log.0
2016/08/11 16:57:51.723534 log.go:207: DBG  Setting offset for file: /var/svc/log/site-app-version:preview1.log.0. Offset: 0 
2016/08/11 16:57:51.723550 log.go:45: INFO Harvester started for file: /var/svc/log/site-app-version:preview1.log.0
2016/08/11 16:57:51.723598 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/site-app-version:default1.log.6
2016/08/11 16:57:51.723362 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:preview1.log.1
2016/08/11 16:57:51.722931 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/site-app-version:default1.log.2
2016/08/11 16:57:51.723806 log.go:207: DBG  Setting offset for file: /var/svc/log/site-app-version:default1.log.6. Offset: 0 
2016/08/11 16:57:51.723864 log.go:45: INFO Harvester started for file: /var/svc/log/site-app-version:default1.log.6
2016/08/11 16:57:51.723061 log.go:234: DBG  Update state: /var/svc/log/application-filebeat:filebeat.log.1, offset: 0
2016/08/11 16:57:51.723909 log.go:234: DBG  Update state: /var/svc/log/site-app-version:preview1.log.0, offset: 0
2016/08/11 16:57:51.723653 state.go:64: DBG  New state added for /var/svc/log/application-filebeat:filebeat.log
2016/08/11 16:57:51.723984 state.go:64: DBG  New state added for /var/svc/log/application-filebeat:filebeat.log.1
2016/08/11 16:57:51.724012 state.go:64: DBG  New state added for /var/svc/log/site-app-version:preview1.log.0
2016/08/11 16:57:51.723852 log.go:207: DBG  Setting offset for file: /var/svc/log/site-app-version:default1.log.2. Offset: 0 
2016/08/11 16:57:51.724267 log.go:45: INFO Harvester started for file: /var/svc/log/site-app-version:default1.log.2
2016/08/11 16:57:51.724257 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:preview1.log.1
2016/08/11 16:57:51.724451 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:preview1.log.6
2016/08/11 16:57:51.724570 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:preview1.log.6
2016/08/11 16:57:51.724697 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/site-app-version:preview1.log.1
2016/08/11 16:57:51.724736 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/site-app-version:preview1.log.7
2016/08/11 16:57:51.724904 log.go:228: DBG  Setting offset for file based on seek: /var/svc/log/site-app-version:preview1.log.6
2016/08/11 16:57:51.724934 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/site-app-version:preview1.log.7
2016/08/11 16:57:51.725103 prospector_log.go:140: DBG  Check file for harvesting: /var/svc/log/application-filebeat:filebeat.log.0
2016/08/11 16:57:51.725209 prospector_log.go:159: DBG  Start harvester for new file: /var/svc/log/application-filebeat:filebeat.log.0

Steffens - Thanks for looking into it. It is weird: I don't see anything useful in the console output, but after flushing all the events Filebeat just goes down.

2016/08/11 16:57:51.735960 state.go:64: DBG  New state added for /var/svc/log/site-app-version:preview1.log
2016/08/11 16:57:51.735924 log.go:234: DBG  Update state: /var/svc/log/site-app-version:preview1.log.2, offset: 0
2016/08/11 16:57:51.736052 state.go:64: DBG  New state added for /var/svc/log/site-app-version:preview1.log.2
2016/08/11 16:57:51.752886 spooler.go:115: DBG  Flushing spooler because spooler full. Events flushed: 2048
2016/08/11 16:57:51.766834 client.go:180: DBG  Publish:
{
  "@timestamp": "2016-08-11T16:57:51.752Z",
  "beat": {
    "hostname": "app1.stg0",
    "name": "app1.stg0"
  },
  "input_type": "log",
  "message": "{\"name\":\"app-version-service\",\"hostname\":\"app1.stg0\",\"pid\":16212,\"level\":30,\"msg\":\"Process cache update complete.\",\"time\":\"2016-08-04T07:16:22.858Z\",\"v\":1,\"tags\":{\"app-version-state\":true,\"debug\":true},\"event\":{\"timestamp\":1470294982858,\"tags\":[\"app-version-state\",\"debug\"],\"data\":\"Process cache update complete.\"}}",
  "offset": 25818,
  "source": "/var/svc/log/site-app-version:default1.log.6",
  "type": "log"
}
2016/08/11 16:57:51.983909 output.go:108: DBG  output worker: publish 2018 events
panic: runtime error: invalid memory address or nil pointer dereference
        panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x20 pc=0x62bd6f]

goroutine 6 [running]:
panic(0xa2cec0, 0xc8200120e0)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:464 +0x3ef
github.com/elastic/beats/libbeat/outputs/logstash.(*logstash).Close(0xc82000e520, 0x0, 0x0)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/outputs/logstash/logstash.go:158 +0x3f
github.com/elastic/beats/libbeat/publisher.(*outputWorker).onStop(0xc82008c460)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/output.go:60 +0x48
github.com/elastic/beats/libbeat/publisher.(*messageWorker).shutdown(0xc82008c460)
        /home/jenkins/workspace/Filebeat/src/github.com/elastic/beats/libbeat/publisher/worker.go:75 +0x3a
panic(0xa2cec0, 0xc8200120e0)
        /home/jenkins/tools/org.jenkinsci.plugins.golang.GolangInstallation/1.6/src/runtime/panic.go:4

These traces do not make much sense to me. It seems the logstash output mode is not configured correctly; I just don't know how this can happen.
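To illustrate the class of failure (this is a sketch, not the actual beats code; the type and field names `logstashOutput` and `connectionMode` are invented for the example): if config parsing silently skips initializing the output's internal mode, any later call through that nil value panics with exactly the "invalid memory address or nil pointer dereference" seen in the traces above.

```go
package main

import "fmt"

// connectionMode stands in for the output's internal mode (hypothetical name).
type connectionMode interface {
	Close() error
}

// logstashOutput sketches an output whose mode is only set during config init.
type logstashOutput struct {
	mode connectionMode // stays nil if initialization was skipped
}

// Close calls through the possibly-nil mode, panicking at runtime if it is nil.
func (l *logstashOutput) Close() error {
	return l.mode.Close()
}

func main() {
	out := &logstashOutput{} // mode never initialized
	defer func() {
		// prints the same runtime error message as the stack traces above
		fmt.Println("recovered:", recover())
	}()
	out.Close()
}
```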

The config file you posted looks odd regarding indentation: it is very inconsistent, and I am not sure the logstash section is correct. For example:

output:
  logstash:
  hosts: [_SERVER_HOST]
#  worker: 2
#  max_retries: -1
#  bulk_max_size: 2048

Beats configs are sensitive to indentation (the number of spaces/tabs used), so make sure not to use tabs at all and indent by 2 spaces.

Normally it should look like this:

output:
  logstash:
    hosts: ["_SERVER_HOST"]
    #worker: 2
    #max_retries: -1
    #bulk_max_size: 2048

Normally this should be caught by the startup routine. Can you try to clean up your config by consistently indenting with 2 space characters (a number of other options seem out of place) and see if that solves the problem? If we can verify it works with a proper config file, I at least have some hints for finding a potential code fix.
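For reference, the whole file cleaned up with consistent 2-space indentation could look like the following. This is a sketch, not a verified config: the placeholders (_ENV_, _PLATFORM_, __IP__, _SERVER_HOST, ${hostname}) are kept from your file, and the per-prospector options (input_type, ignore_older, fields, tail_files, multiline, etc.) are moved under the prospector entry where they belong:

```yaml
filebeat:
  registry_file: /opt/gl-filebeat/.gl-filebeat-registry
  spool_size: 2048
  prospectors:
    - paths:
        - /var/svc/log/application.log
      input_type: log
      ignore_older: 1h
      max_bytes: 2097152
      fields:
        ENV: _ENV_
        PLATFORM: _PLATFORM_
        ASSEMBLY: ${hostname}
        IPv4: __IP__
      fields_under_root: false
      # tail_files: true means filebeat starts reading new files at the end
      # instead of the beginning; combined with log rotation, the first
      # entries of a new file may be skipped.
      tail_files: true
      multiline:
        pattern: "^[[:space:]]+(at|...)|^Caused by:"
        negate: false
        match: after

output:
  logstash:
    hosts: ["_SERVER_HOST"]
    #worker: 2
    #max_retries: -1
    #bulk_max_size: 2048

logging:
  level: info
  to_files: true
  to_syslog: false
  files:
    path: /var/log/filebeat
    name: filebeat.log
    keepfiles: 7
```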


Steffens - You are bang on: it does work with a proper config file. Appreciate all your help, guys!

Great, it's working. Thanks for reporting back. This is an error that should have been caught on startup; I will try to fix it soonish.

This topic was automatically closed after 21 days. New replies are no longer allowed.