After appending a new entry to the log file that Filebeat is monitoring, Filebeat's debug output is as follows:
2018-07-06T15:12:21.543+0800 DEBUG [publish] pipeline/processor.go:291 Publish event: {
"@timestamp": "2018-07-06T07:12:16.327Z",
"@metadata": {
"beat": "filebeat",
"type": "doc",
"version": "6.3.0"
},
"input": {
"type": "log"
},
"prospector": {
"type": "log"
},
"beat": {
"name": "server2",
"hostname": "server2",
"version": "6.3.0"
},
"host": {
"name": "server2"
},
"source": "/home/testlog/taw.log",
"offset": 380,
"message": "recorder=ads vsid=2 sub_type=attacklog dst_addr=123.125.127.183 \nid=tos time=\"2017-08-16 16:58:50\" fw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.19 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuwwwww\"\nfw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.1 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuw\""
}
2018-07-06T15:12:22.970+0800 ERROR logstash/async.go:235 Failed to publish events caused by: write tcp 192.168.33.212:51106->192.168.33.85:5044: write: connection reset by peer
2018-07-06T15:12:23.971+0800 ERROR pipeline/output.go:92 Failed to publish events: write tcp 192.168.33.212:51106->192.168.33.85:5044: write: connection reset by peer
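My guess is that the ERROR lines are the cause: Filebeat delivers at-least-once, so when the connection to Logstash is reset mid-batch it resends the whole batch, and Logstash may have already processed the first copy. One mitigation I have seen suggested (this is an assumption on my part, not something I have verified) is raising `client_inactivity_timeout` on the Logstash beats input so Logstash does not close idle Filebeat connections:

```conf
# Sketch of the Logstash beats input; the port matches my setup (5044),
# the timeout value is an arbitrary example.
input {
  beats {
    port => 5044
    # Default is 60 seconds; if Logstash closes an idle connection,
    # Filebeat sees "connection reset by peer" and retries the batch.
    client_inactivity_timeout => 600
  }
}
```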
Logstash then sends the same event to Elasticsearch twice in a row:
{
"source" => "/home/testlog/taw.log",
"sub_type" => "attacklog",
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"host" => "server2",
"dst_addr" => "123.125.127.183",
"offset" => 380,
"show" => "This data is the test data",
"prospector" => {
"type" => "log"
},
"@timestamp" => 2018-07-06T07:12:16.322Z,
"input" => {
"type" => "log"
},
"@version" => "1",
"message" => "recorder=ads vsid=2 sub_type=attacklog dst_addr=123.125.127.183 \nid=tos time=\"2017-08-16 16:58:50\" fw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.19 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuwwwww\"\nfw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.1 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuw\"",
"type" => "replace_test",
"beat" => {
"version" => "6.3.0",
"name" => "server2",
"hostname" => "server2"
}
}
{
"source" => "/home/testlog/taw.log",
"sub_type" => "attacklog",
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"host" => "server2",
"dst_addr" => "123.125.127.183",
"offset" => 380,
"show" => "This data is the test data",
"prospector" => {
"type" => "log"
},
"@timestamp" => 2018-07-06T07:12:16.327Z,
"input" => {
"type" => "log"
},
"@version" => "1",
"message" => "recorder=ads vsid=2 sub_type=attacklog dst_addr=123.125.127.183 \nid=tos time=\"2017-08-16 16:58:50\" fw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.19 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuwwwww\"\nfw=IPS01.PUB.BEIJING-B pri=6 type=mgmt user=superman src=10.3.42.1 op=\"network interface eth13 show\" result=0 recorder=config msg=\"nuw\"",
"type" => "replace_test",
"beat" => {
"version" => "6.3.0",
"name" => "server2",
"hostname" => "server2"
}
}
I don't understand why this happens. I have not configured any replicas, and this is a single-node setup.
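In case it helps, the workaround I am considering (a sketch only, assuming the standard `fingerprint` filter plugin is installed) is to derive the Elasticsearch `document_id` from a hash of the message, so a retried batch overwrites the existing document instead of creating a duplicate. The index name and key value here are placeholders:

```conf
# Deduplicate retried events by using a content hash as the document _id.
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
    key    => "any-fixed-string"   # arbitrary; just keep it constant
  }
}
output {
  elasticsearch {
    hosts       => ["192.168.33.85:9200"]      # placeholder host
    index       => "testlog-%{+YYYY.MM.dd}"    # placeholder index name
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

With a deterministic `document_id`, the second delivery of the same event becomes an idempotent overwrite rather than a new document.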