Filebeat large files

We're using Filebeat 1.0.0, and have also tested the nightly release of 1.2, but that doesn't seem to have fixed the issue.

We have an issue where, once a log file grows past 2 GB, Filebeat no longer seems to process it. The other files it is watching continue to be processed.
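
In case it helps reproduce this, here's a minimal sketch for spotting which watched files have crossed the 2 GiB boundary. The glob pattern is just a placeholder for whatever paths Filebeat is actually configured to watch:

```python
import glob
import os

# Placeholder pattern -- substitute the paths your Filebeat prospector watches.
WATCHED_PATTERN = "/var/log/myapp/*.log"
TWO_GIB = 2 ** 31  # the ~2 GB boundary where the file stopped being processed

for path in sorted(glob.glob(WATCHED_PATTERN)):
    size = os.path.getsize(path)
    marker = "  <-- over 2 GiB" if size >= TWO_GIB else ""
    print(f"{size:>14,} bytes  {path}{marker}")
```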

Any ideas?

The offset in the registry file still seems to increase, but the data doesn't make it through to the output.
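
One rough way to check this is to compare the registry against the files on disk. This sketch assumes the Filebeat 1.x JSON registry layout (an object keyed by source path, with an `offset` per entry) and a registry path that will differ per setup:

```python
import json
import os

# Assumed registry location; Filebeat 1.x writes a JSON registry file
# (often ".filebeat" in its working directory) -- adjust for your setup.
REGISTRY_PATH = "/var/lib/filebeat/.filebeat"

with open(REGISTRY_PATH) as fh:
    registry = json.load(fh)

# Assuming the 1.x layout: a JSON object keyed by source path, each entry
# holding the byte offset Filebeat believes it has shipped.
for source, state in sorted(registry.items()):
    offset = state.get("offset", 0)
    size = os.path.getsize(source) if os.path.exists(source) else None
    unread = None if size is None else size - offset
    print(f"{source}: offset={offset} size={size} unread={unread}")
```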

This might actually have been an issue with Elasticsearch. Still investigating that, though.
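
As a first step in that investigation, a quick cluster health check can rule out indexing being blocked outright (the host and port below are just an assumption for a local node):

```python
import json
import urllib.request

ES_URL = "http://localhost:9200"  # assumption: a local Elasticsearch node

# Cluster health shows whether indexing could be blocked outright
# (red status, unassigned shards, etc.).
with urllib.request.urlopen(f"{ES_URL}/_cluster/health") as resp:
    health = json.load(resp)

print(json.dumps(health, indent=2))
```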

Do you send data directly to Elasticsearch or via Logstash? What's the mapping of your index?

We're sending it to Logstash. The mapping is the one we had created within Logstash. Data seems to be going through OK now; we think the actual problem was with Elasticsearch, but at this stage we have no idea what or why.
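
For completeness, this is roughly how the mapping and document count can be pulled back out of Elasticsearch to confirm events are arriving; the index name below is a hypothetical daily Logstash index, not our actual one:

```python
import json
import urllib.request

ES_URL = "http://localhost:9200"  # assumption: a local Elasticsearch node
INDEX = "logstash-2016.01.01"     # hypothetical daily Logstash index name

# The mapping that Logstash created for the index.
with urllib.request.urlopen(f"{ES_URL}/{INDEX}/_mapping") as resp:
    print(json.dumps(json.load(resp), indent=2))

# Document count, to confirm events are actually arriving.
with urllib.request.urlopen(f"{ES_URL}/{INDEX}/_count") as resp:
    print("doc count:", json.load(resp)["count"])
```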