The data is sent to Elasticsearch via Logstash. The data contains escape characters, and Elasticsearch reports an error after receiving it.
This is the event emitted by Logstash:
{
       "message" => "id=\"ngtos\" version=\"1.0\" time=\"2017-05-08 01:07:53\" dev=\"1235668\" pri=\"6\" type=\"ddos_clean\" recorder=\"ads\" vsid=\"0\" sub_type=attacklog dst_addr=1.1.1.1 zonename=1504152157558 grpname=test attack_status=begin src_addr=128.18.74.44;128.19.75.33 service= protocol_4=TCP dst_port=5060 attack_type=\"http-flood\" defense_method=\"http-source-auth\" cur_cfg_value=1 cfg_value_unit=pps total_packets=1500 attack_packets=1000 total_bytes=12590 attack_bytes=1240 action=drop attack_msgs=\"http-flood\" backup1=0 backup2= backup3= client_addr=1.1.1.1 ip_flow_b=2000 cut_ip_flow_b=500 tcp_flow_b=2000 cut_tcp_flow_b=200 dns_flow_b=1000 cut_dns_flow_b=100 http_flow_b=2000 cut_http_flow_b=200 sip_flow_b=3000 cut_sip_flow_b=300 ipfrag_flow_b=2000 cut_ipfrag_flow_b=100",
        "offset" => 751,
    "prospector" => {
        "type" => "log"
    },
      "@version" => "1",
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ],
        "source" => "/home/filebeat/testlog/aads_new.log",
    "@timestamp" => 2018-07-02T08:55:35.974Z,
         "input" => {
        "type" => "log"
    },
          "beat" => {
        "hostname" => "n",
            "name" => "n",
         "version" => "6.3.0"
    },
          "host" => {
        "name" => "n"
    }
}
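As an aside, the `_grokparsefailure` tag shows my grok pattern is not matching this message. Since the message body is plain key=value pairs, I am considering a `kv` filter instead of grok; this is only a sketch based on the format shown above, not my current config:

```
filter {
  # Sketch: parse the key=value message body with kv instead of grok.
  # field_split / value_split match the format above; trim_value strips
  # the surrounding quotes from values like attack_type="http-flood".
  kv {
    source      => "message"
    field_split => " "
    value_split => "="
    trim_value  => "\""
  }
}
```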
This is the error message reported by Elasticsearch after receiving it:
[2018-07-02T16:57:45,620][DEBUG][o.e.a.b.TransportShardBulkAction] [logstash-2018.07.02][2] failed to execute bulk item (index) BulkShardRequest [[logstash-2018.07.02][2]] containing [index {[logstash-2018.07.02][doc][GL42WmQBBKeqWE53TJHN], source[{"message":"id=\"ngtos\" version=\"1.0\" time=\"2017-05-08 01:07:53\" dev=\"1235668\" pri=\"6\" type=\"ddos_clean\" recorder=\"ads\" vsid=\"0\" sub_type=attacklog dst_addr=1.1.1.1 zonename=1504152157558 grpname=test attack_status=begin src_addr=128.18.74.44;128.19.75.33 service= protocol_4=TCP dst_port=5060 attack_type=\"http-flood\" defense_method=\"http-source-auth\" cur_cfg_value=1 cfg_value_unit=pps total_packets=1500 attack_packets=1000 total_bytes=12590 attack_bytes=1240 action=drop attack_msgs=\"http-flood\" backup1=0 backup2= backup3= client_addr=1.1.1.1 ip_flow_b=2000 cut_ip_flow_b=500 tcp_flow_b=2000 cut_tcp_flow_b=200 dns_flow_b=1000 cut_dns_flow_b=100 http_flow_b=2000 cut_http_flow_b=200 sip_flow_b=3000 cut_sip_flow_b=300 ipfrag_flow_b=2000 cut_ipfrag_flow_b=100","offset":751,"prospector":{"type":"log"},"@version":"1","tags":["beats_input_codec_plain_applied","_grokparsefailure"],"source":"/home/filebeat/testlog/aads_new.log","@timestamp":"2018-07-02T08:55:35.974Z","input":{"type":"log"},"beat":{"hostname":"n","name":"n","version":"6.3.0"},"host":{"name":"n"}}]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [host]
at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:302) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:481) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:496) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:390) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:380) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.internalParseDocument(DocumentParser.java:95) ~[elasticsearch-6.3.0.jar:6.3.0]
.......
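My guess is that this is the known conflict in the 6.3 Beats release: Filebeat now sends `host` as an object (`host.name`), while my existing `logstash-*` mapping already has `host` as a plain text field, so the mapper fails on the object. If that is the cause, a workaround sketch in the Logstash filter stage (field names taken from the event above, the `@metadata` key is just a scratch name I chose) would be:

```
filter {
  # Sketch: flatten the Beats 6.3 "host" object back to a plain string
  # so it fits an existing text/keyword mapping for "host".
  if [host][name] {
    mutate {
      rename => { "[host][name]" => "[@metadata][hostname]" }
    }
    mutate {
      remove_field => [ "host" ]
    }
    mutate {
      add_field => { "host" => "%{[@metadata][hostname]}" }
    }
  }
}
```

Is this the right approach, or should I instead fix the index template so `host` is mapped as an object?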