_grokparsefailure and missing events when using Logstash Forwarder

I have a strange issue: all events sent to Logstash from Logstash Forwarder via the Lumberjack input plugin are not indexed at all (they go completely missing; no indexes are created in Elasticsearch), and a subset of the events are tagged with _grokparsefailure. However, when the same files are processed locally on the Logstash server using the File input plugin, there are no issues and the indexes are created in Elasticsearch.

Input and output configurations are:

input {
  # lumberjack {
  #   port => 6782
  #   ssl_certificate => '/etc/pki/tls/certs/logstash-forwarder.crt'
  #   ssl_key => '/etc/pki/tls/private/logstash-forwarder.key'
  # }

  file {
    path => '/tmp/all-fuse-logs/*'
    start_position => 'beginning'
    type => 'karaf'
    sincedb_path => '/tmp/csms.sincedb'
  }
}
output {
  elasticsearch { # Store event in datastore.
    host => 'abc-log.def.ghi'
    index => "logstash-%{+YYYY.MM.dd}-%{host}-%{type}"
  }
}
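
The commented-out lumberjack block is what I enable when shipping with Logstash Forwarder (with the file input commented out instead), so in that mode the input section looks like this:

input {
  lumberjack {
    port => 6782
    ssl_certificate => '/etc/pki/tls/certs/logstash-forwarder.crt'
    ssl_key => '/etc/pki/tls/private/logstash-forwarder.key'
  }
}

Note that this input sets no type, so any type on those events would have to come from the forwarder side.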

Does anyone have any idea what is going on and how I should proceed?

Thanks,

Comment out the ES output and use a straight stdout { codec => rubydebug } to understand more of what's happening.
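
That is, something like:

output {
  stdout { codec => rubydebug }
}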

Hi Magnus, also note that when I alter the matching pattern to:

match => [ 'message', "%{GREEDYDATA:text}" ]

I still receive _grokparsefailure when using the Logstash Forwarder shipper.
I will give your suggestion a try.
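
For reference, here is a minimal sketch of the filter block with that pattern (assuming a bare grok filter with no surrounding conditionals, which may differ from my full config):

filter {
  grok {
    # GREEDYDATA matches anything, so a parse failure here is unexpected.
    match => [ 'message', "%{GREEDYDATA:text}" ]
  }
}

Since %{GREEDYDATA} matches any message, the _grokparsefailure with this pattern is what puzzles me most.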