Hello,
I'm trying to get Packetbeat to add the user's IP address to the incoming payload and then submit that to ES. ES is running, and I can see the data my app submits (it's all correct). I then configured Packetbeat and ran it. The data that gets submitted still does not include the user's IP address. I connected with Kibana to make sure I wasn't missing anything; the user's IP is not there.
Is this even possible? Do I have to use Logstash between ES and Packetbeat?
My app submits HTTP requests to http://10.223.24.180:9200 (the local host).
One thing I'm confused about: should my app submit to the ES port, or should it submit to an HTTP port that Packetbeat listens on, with Packetbeat then sending that data on to the ES port?
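To put that another way, I don't know whether I would need to add the ES port to the list of ports Packetbeat sniffs. That's just a guess on my part; a sketch of what I mean:

packetbeat.protocols.http:
  # hypothetical change: also sniff 9200, the port my app actually sends to
  ports: [80, 8080, 8081, 5000, 8002, 9200]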
My ES config is only this:
network.host: 10.223.24.180
http.port: 9200
My Packetbeat config:
packetbeat.interfaces.device: 0
packetbeat.interfaces.with_vlans: true
packetbeat.interfaces.type: pcap
packetbeat.flows:
  timeout: 30s
  period: 10s
  enabled: false
packetbeat.protocols.icmp:
  enabled: false
packetbeat.protocols.amqp:
  ports: [5672]
  enabled: false
packetbeat.protocols.cassandra:
  ports: [9042]
  enabled: false
packetbeat.protocols.dns:
  ports: [53]
  enabled: false
  include_authorities: true
  include_additionals: true
packetbeat.protocols.http:
  ports: [80, 8080, 8081, 5000, 8002]
  send_request: true
  send_response: true
  send_all_headers: true
  include_body_for: ["text/html", "application/json"]
packetbeat.protocols.memcache:
  ports: [11211]
  enabled: false
packetbeat.protocols.mysql:
  ports: [3306]
  enabled: false
packetbeat.protocols.pgsql:
  ports: [5432]
  enabled: false
packetbeat.protocols.redis:
  ports: [6379]
  enabled: false
packetbeat.protocols.thrift:
  ports: [9090]
  enabled: false
packetbeat.protocols.mongodb:
  ports: [27017]
  enabled: false
packetbeat.protocols.nfs:
  ports: [2049]
  enabled: false
output.file:
  path: "/tmp/packetbeat"
  filename: packetbeat
output.elasticsearch:
  hosts: ["http://10.223.24.180:9200"]
  protocol: "http"
  processors:
    - include_fields:
        fields:
          - ip
          - client_ip
  path: "/elasticsearch"
  template.enabled: true
  template.path: "packetbeat.template-es2x.json"
  template.overwrite: false
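One thing I'm not sure about (just my guess, I haven't confirmed it) is whether that processors block should be nested under output.elasticsearch at all, or defined at the top level of packetbeat.yml instead, something like:

# hypothetical placement: the same include_fields processor, but at the root of the config
processors:
  - include_fields:
      fields:
        - ip
        - client_ip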
This is the debug output for Packetbeat, as it starts:
2017/02/15 18:39:41.292095 beat.go:267: INFO Home path: [C:\Program Files\packetbeat] Config path: [C:\Program Files\packetbeat] Data path: [C:\Program Files\packetbeat\data] Logs path: [C:\Program Files\packetbeat\logs]
2017/02/15 18:39:41.292095 beat.go:177: INFO Setup Beat: packetbeat; Version: 5.2.0
2017/02/15 18:39:41.292095 logp.go:219: INFO Metrics logging every 30s
2017/02/15 18:39:41.292095 file.go:45: INFO File output path set to: /tmp/packetbeat
2017/02/15 18:39:41.292095 file.go:46: INFO File output base filename set to: packetbeat
2017/02/15 18:39:41.292095 file.go:49: INFO Rotate every bytes set to: 10485760
2017/02/15 18:39:41.292095 file.go:53: INFO Number of files set to: 7
2017/02/15 18:39:41.292095 outputs.go:106: INFO Activated file as output plugin.
2017/02/15 18:39:41.292095 output.go:167: INFO Loading template enabled. Reading template file: C:\Program Files\packetbeat\packetbeat.template-es2x.json
2017/02/15 18:39:41.293095 output.go:178: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: C:\Program Files\packetbeat\packetbeat.template-es2x.json
2017/02/15 18:39:41.294096 client.go:120: INFO Elasticsearch url: http://10.223.24.180:9200/elasticsearch
2017/02/15 18:39:41.294096 outputs.go:106: INFO Activated elasticsearch as output plugin.
2017/02/15 18:39:41.294096 publish.go:234: DBG Create output worker
2017/02/15 18:39:41.295096 publish.go:234: DBG Create output worker
2017/02/15 18:39:41.295096 publish.go:276: DBG No output is defined to store the topology. The server fields might not be filled.
2017/02/15 18:39:41.295096 publish.go:291: INFO Publisher name: bobpur-3358
2017/02/15 18:39:41.297098 async.go:63: INFO Flush Interval set to: -1s
2017/02/15 18:39:41.297098 async.go:64: INFO Max Bulk Size set to: -1
2017/02/15 18:39:41.297098 async.go:63: INFO Flush Interval set to: 1s
2017/02/15 18:39:41.297098 async.go:64: INFO Max Bulk Size set to: 50
2017/02/15 18:39:41.297098 async.go:72: DBG create bulk processing worker (interval=1s, bulk size=50)
2017/02/15 18:39:41.297098 procs.go:79: INFO Process matching disabled
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: cassandra
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: mysql
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: thrift
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: pgsql
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: redis
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: amqp
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: dns
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: http
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: memcache
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: mongodb
2017/02/15 18:39:41.298098 protos.go:89: INFO registered protocol plugin: nfs
2017/02/15 18:39:41.298098 protos.go:111: INFO Protocol plugin 'redis' disabled by config
2017/02/15 18:39:41.298098 protos.go:111: INFO Protocol plugin 'cassandra' disabled by config
2017/02/15 18:39:41.298098 protos.go:111: INFO Protocol plugin 'pgsql' disabled by config
2017/02/15 18:39:41.298098 protos.go:111: INFO Protocol plugin 'mysql' disabled by config
2017/02/15 18:39:41.298098 protos.go:111: INFO Protocol plugin 'nfs' disabled by config
2017/02/15 18:39:41.299099 protos.go:111: INFO Protocol plugin 'memcache' disabled by config
2017/02/15 18:39:41.299099 protos.go:111: INFO Protocol plugin 'thrift' disabled by config
2017/02/15 18:39:41.300099 protos.go:111: INFO Protocol plugin 'amqp' disabled by config
2017/02/15 18:39:41.300099 protos.go:111: INFO Protocol plugin 'dns' disabled by config
2017/02/15 18:39:41.300099 protos.go:111: INFO Protocol plugin 'mongodb' disabled by config
2017/02/15 18:39:41.301100 sniffer.go:270: DBG BPF filter: 'tcp port 80 or tcp port 8080 or tcp port 8081 or tcp port 5000 or tcp port 8002'
2017/02/15 18:39:41.313109 sniffer.go:145: INFO Resolved device index 0 to device: \Device\NPF_{0986E0A4-99DD-4410-9A76-B4E9F2AA8AE1}
2017/02/15 18:39:41.314109 sniffer.go:156: DBG Sniffer type: pcap device: \Device\NPF_{0986E0A4-99DD-4410-9A76-B4E9F2AA8AE1}
2017/02/15 18:39:41.318113 beat.go:207: INFO packetbeat start running.
2017/02/15 18:39:41.818828 sniffer.go:322: DBG Interrupted
It keeps repeating that last line until I stop it, with the occasional
2017/02/15 18:40:11.292374 logp.go:232: INFO No non-zero metrics in the last 30s
Thank you for reading.