Please correct the analyzer to not produce such terms

I am getting the following error from Logstash. Any idea how to fix it? Is there a limit on how many bytes a string field can contain? The field is mapped as:

"CONTENT_BODY" : {
    "index" : "not_analyzed",
    "type" : "string"
},

The full stack trace:
java.lang.IllegalArgumentException: Document contains at least one immense term in field="CONTENT_BODY" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[65, 116, 32, 77, 101, 114, 99, 121, 32, 77, 117, 108, 116, 105, 112, 108, 105, 101, 100, 44, 32, 116, 114, 111, 117, 98, 108, 101, 100, 32]...', original message: bytes can be at most 32766 in length; got 41259
at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:687)
at org.apache.lucene.index.DefaultIndexingChain.processField(DefaultIndexingChain.java:359)
at org.apache.lucene.index.DefaultIndexingChain.processDocument(DefaultIndexingChain.java:318)
at org.apache.lucene.index.DocumentsWriterPerThread.updateDocument(DocumentsWriterPerThread.java:241)
at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:465)
at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1526)
at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:1252)
at org.elasticsearch.index.engine.InternalEngine.innerCreateNoLock(InternalEngine.java:345)
at org.elasticsearch.index.engine.InternalEngine.innerCreate(InternalEngine.java:287)
at org.elasticsearch.index.engine.InternalEngine.create(InternalEngine.java:259)
at org.elasticsearch.index.shard.IndexShard.create(IndexShard.java:483)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:423)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.lucene.util.BytesRefHash$MaxBytesLengthExceededException: bytes can be at most 32766 in length; got 41259
at org.apache.lucene.util.BytesRefHash.add(BytesRefHash.java:284)
at org.apache.lucene.index.TermsHashPerField.add(TermsHashPerField.java:151)
at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:663)
... 18 more

The problem is the "index" : "not_analyzed" setting: it tells Lucene to index the whole field value as a single term, and a single term cannot exceed 32766 bytes (your document produced one of 41259 bytes). Remove that setting:

"CONTENT_BODY" : {
    "type" : "string"
}

Since the default for "index" is "analyzed", the content is then tokenized into individual terms, each of which stays well under the limit.