KV field_split prevents Logstash from ingesting data

This one might be easier:

Comma-separated logging ON:

    kv {
        value_split => "="
        field_split => ","
    }

[2017-09-19T16:58:52,098][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-09-19T07:58:52.087Z, "@version"=>"1", "host"=>"1.1.1.1", "message"=>"<132>date=2017-09-19,time=08:58:44,devname=fortitest,device_id=FGT60C11111,log_id=0038000007,type=traffic,subtype=other,pri=warning,vd=root,src=10.0.0.173,src_port=51452,src_int=\"wan1\",dst=10.0.0.255,dst_port=1947,dst_int=\"root\",SN=212789,status=deny,policyid=0,dst_country=\"Reserved\",src_country=\"Reserved\",service=1947/udp,proto=17,duration=349781,sent=0,rcvd=0,msg=\"iprope_in_check() check failed, drop\"", "type"=>"fortigate"}}
[2017-09-19T16:58:52,100][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>2017-09-19T07:58:52.087Z 1.1.1.1 <132>date=2017-09-19,time=08:58:44,devname=fortitest,device_id=FGT60C11111,log_id=0038000007,type=traffic,subtype=other,pri=warning,vd=root,src=10.0.0.173,src_port=51452,src_int="wan1",dst=10.0.0.255,dst_port=1947,dst_int="root",SN=212789,status=deny,policyid=0,dst_country="Reserved",src_country="Reserved",service=1947/udp,proto=17,duration=349781,sent=0,rcvd=0,msg="iprope_in_check() check failed, drop"}
[2017-09-19T16:58:52,105][DEBUG][logstash.filters.grok    ] Event now:  {:event=>2017-09-19T07:58:52.087Z 1.1.1.1 date=2017-09-19,time=08:58:44,devname=fortitest,device_id=FGT60C11111,log_id=0038000007,type=traffic,subtype=other,pri=warning,vd=root,src=10.0.0.173,src_port=51452,src_int="wan1",dst=10.0.0.255,dst_port=1947,dst_int="root",SN=212789,status=deny,policyid=0,dst_country="Reserved",src_country="Reserved",service=1947/udp,proto=17,duration=349781,sent=0,rcvd=0,msg="iprope_in_check() check failed, drop"}
[2017-09-19T16:58:52,108][DEBUG][logstash.pipeline        ] output received {"event"=>{"date"=>"2017-09-19", "src_int"=>"wan1", "msg"=>"iprope_in_check() check failed, drop", "dst"=>"10.0.0.255", "type"=>"traffic", "dst_int"=>"root", "duration"=>"349781", "policyid"=>"0", "subtype"=>"other", "syslog5424_pri"=>"132", "@version"=>"1", "host"=>"1.1.1.1", "devname"=>"fortitest", "SN"=>"212789", "dst_country"=>"Reserved", "log_id"=>"0038000007", "device_id"=>"FGT60C11111", "src"=>"10.0.0.173", "pri"=>"warning", "rcvd"=>"0", "message"=>"date=2017-09-19,time=08:58:44,devname=fortitest,device_id=FGT60C11111,log_id=0038000007,type=traffic,subtype=other,pri=warning,vd=root,src=10.0.0.173,src_port=51452,src_int=\"wan1\",dst=10.0.0.255,dst_port=1947,dst_int=\"root\",SN=212789,status=deny,policyid=0,dst_country=\"Reserved\",src_country=\"Reserved\",service=1947/udp,proto=17,duration=349781,sent=0,rcvd=0,msg=\"iprope_in_check() check failed, drop\"", "sent"=>"0", "vd"=>"root", "src_port"=>"51452", "@timestamp"=>2017-09-19T07:58:52.087Z, "syslog_index"=>"<132>", "src_country"=>"Reserved", "service"=>"1947/udp", "proto"=>"17", "dst_port"=>"1947", "time"=>"08:58:44", "status"=>"deny"}}
[2017-09-19T16:58:54,621][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-09-19T16:58:59,620][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline


Nothing is logged to Elasticsearch.

and

Comma-separated logging OFF:

    kv {
        value_split => "="
        field_split => ","
    }

[2017-09-19T16:53:31,506][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-09-19T07:53:31.490Z, "@version"=>"1", "host"=>"1.1.1.1", "message"=>"<132>date=2017-09-19 time=08:53:25 devname=fortitest device_id=FGT60C11111 log_id=0038000007 type=traffic subtype=other pri=warning vd=root src=10.0.0.208 src_port=17500 src_int=\"wan1\" dst=10.0.0.255 dst_port=17500 dst_int=\"root\" SN=212550 status=deny policyid=0 dst_country=\"Reserved\" src_country=\"Reserved\" service=17500/udp proto=17 duration=349461 sent=0 rcvd=0 msg=\"iprope_in_check() check failed, drop\"", "type"=>"fortigate"}}
[2017-09-19T16:53:31,510][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>2017-09-19T07:53:31.490Z 1.1.1.1 <132>date=2017-09-19 time=08:53:25 devname=fortitest device_id=FGT60C11111 log_id=0038000007 type=traffic subtype=other pri=warning vd=root src=10.0.0.208 src_port=17500 src_int="wan1" dst=10.0.0.255 dst_port=17500 dst_int="root" SN=212550 status=deny policyid=0 dst_country="Reserved" src_country="Reserved" service=17500/udp proto=17 duration=349461 sent=0 rcvd=0 msg="iprope_in_check() check failed, drop"}
[2017-09-19T16:53:31,511][DEBUG][logstash.filters.grok    ] Event now:  {:event=>2017-09-19T07:53:31.490Z 1.1.1.1 date=2017-09-19 time=08:53:25 devname=fortitest device_id=FGT60C11111 log_id=0038000007 type=traffic subtype=other pri=warning vd=root src=10.0.0.208 src_port=17500 src_int="wan1" dst=10.0.0.255 dst_port=17500 dst_int="root" SN=212550 status=deny policyid=0 dst_country="Reserved" src_country="Reserved" service=17500/udp proto=17 duration=349461 sent=0 rcvd=0 msg="iprope_in_check() check failed, drop"}
[2017-09-19T16:53:31,512][DEBUG][logstash.pipeline        ] output received {"event"=>{"date"=>"2017-09-19 time=08:53:25 devname=fortitest device_id=FGT60C11111 log_id=0038000007 type=traffic subtype=other pri=warning vd=root src=10.0.0.208 src_port=17500 src_int=\"wan1\" dst=10.0.0.255 dst_port=17500 dst_int=\"root\" SN=212550 status=deny policyid=0 dst_country=\"Reserved\" src_country=\"Reserved\" service=17500/udp proto=17 duration=349461 sent=0 rcvd=0 msg=\"iprope_in_check() check failed", "@timestamp"=>2017-09-19T07:53:31.490Z, "syslog_index"=>"<132>", "syslog5424_pri"=>"132", "@version"=>"1", "host"=>"1.1.1.1", "message"=>"date=2017-09-19 time=08:53:25 devname=fortitest device_id=FGT60C11111 log_id=0038000007 type=traffic subtype=other pri=warning vd=root src=10.0.0.208 src_port=17500 src_int=\"wan1\" dst=10.0.0.255 dst_port=17500 dst_int=\"root\" SN=212550 status=deny policyid=0 dst_country=\"Reserved\" src_country=\"Reserved\" service=17500/udp proto=17 duration=349461 sent=0 rcvd=0 msg=\"iprope_in_check() check failed, drop\"", "type"=>"fortigate"}}
[2017-09-19T16:53:33,341][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-09-19T16:53:38,348][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2017-09-19T16:53:43,352][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline

Everything is logged into the message field in Elasticsearch.

Edit: The above is with an actual Fortigate sending data to Logstash in real time.

Edit 2: Changing the test config (reading sample lines from a .txt file) to output to Elasticsearch instead of stdout results in the same problem.
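For reference, the test pipeline was roughly as follows. This is a sketch from memory: the file path, grok pattern, and Elasticsearch host are placeholders rather than the exact values from my config (the real grok is whatever produces the syslog_index/syslog5424_pri fields visible in the debug output above):

    input {
        file {
            path => "/path/to/fortigate-sample.txt"
            start_position => "beginning"
            sincedb_path => "/dev/null"    # re-read the sample file on every test run
        }
    }

    filter {
        # Strip the leading syslog priority, e.g. "<132>", before the key=value pairs
        grok {
            match => { "message" => "(?<syslog_index><%{NONNEGINT:syslog5424_pri}>)%{GREEDYDATA:message}" }
            overwrite => [ "message" ]
        }
        kv {
            value_split => "="
            field_split => ","
        }
    }

    output {
        stdout { codec => rubydebug }      # fields show up correctly parsed here
        elasticsearch {
            hosts => ["localhost:9200"]    # placeholder
        }
    }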

With

    value_split => "="
    field_split => ","

and non-comma-separated logs, everything is pushed into the message field.
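(That part makes sense to me: with space-separated logs, field_split => "," never matches a separator, so kv extracts nothing and the raw line stays in message. To actually split that format the filter would presumably need something like

    kv {
        value_split => "="
        field_split => " "
    }

but the problem here is the opposite case, where kv does parse.)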

With

    value_split => "="
    field_split => ","

and comma-separated values, which give the correct results in stdout, nothing gets into Elasticsearch.