I'm receiving the following warning in my Logstash container. I think it's because some malformed/corrupt data is coming through.
[WARN ] 2021-06-25 20:38:37.941 [[test]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test-2021.06.25", :routing=>nil, :_type=>"_doc"}, #&lt;LogStash::Event:0x3b1b3313&gt;], :response=>{"index"=>{"_index"=>"test-2021.06.25", "_type"=>"_doc", "id"=>"EcXlRHoBEkSVatD2VBNc", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [fields.property-value] of type [long] in document with id 'EcXlRHoBEkSVatD2VBNc'. Preview of field's value: '18446744073689527672'", "caused_by"=>{"type"=>"input_coercion_exception", "reason"=>"Numeric value (18446744073689527672) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: (byte[])\"\u0017\u0003\u0003\u0001l\u0000\u0000\u0000\u0000\u0000\u0000 ... [unprintable binary payload omitted] ...\"[truncated 1048076 bytes]; line: 1, column: 338]"}}}}}
I have a template applied for this index that maps "property-value" to the "text" type, since long is no longer valid for this field.
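For context, the mapping override I mean looks roughly like this (a simplified sketch; the template name is made up, and the field path `fields.property-value` is taken from the error above):

```
PUT _index_template/test-template
{
  "index_patterns": ["test-*"],
  "template": {
    "mappings": {
      "properties": {
        "fields": {
          "properties": {
            "property-value": { "type": "text" }
          }
        }
      }
    }
  }
}
```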
Is there any way I can filter these events out in Logstash?
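To frame the question: the kind of filter I have in mind would be something like this ruby filter, which drops any event whose `[fields][property-value]` is not a valid signed 64-bit integer (just an untested sketch, assuming the field path from the error above):

```
filter {
  ruby {
    code => '
      v = event.get("[fields][property-value]")
      if v
        begin
          # Elasticsearch long is a signed 64-bit integer.
          n = Integer(v.to_s)
          event.cancel if n > 9223372036854775807 || n < -9223372036854775808
        rescue ArgumentError
          # Not numeric at all -> drop the event.
          event.cancel
        end
      end
    '
  }
}
```

Is something like this the right approach, or is there a more idiomatic way to discard these malformed events?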