ParseHttp exception. Recovering, but please report this: runtime error: slice bounds out of range

Hello.
I'm running Packetbeat 6.2.1 on an Ubuntu machine, connected to Elasticsearch 5.1.2, capturing mirrored traffic.

This is my packetbeat.yml file:

```yaml
packetbeat.interfaces.device: eth5
packetbeat.interfaces.type: af_packet
packetbeat.interfaces.snaplen: 1500
packetbeat.interfaces.buffer_size_mb: 500

packetbeat.flows:

packetbeat.protocols:

- type: dns
  ports: [53]

  # include_authorities controls whether or not the dns.authorities field
  # (authority resource records) is added to messages.
  include_authorities: true

  # include_additionals controls whether or not the dns.additionals field
  # (additional resource records) is added to messages.
  include_additionals: true

- type: http
  # Configure the ports where to listen for HTTP traffic. You can disable
  # the HTTP protocol by commenting out the list of ports.
  enabled: true
  ports: [80, 8080]
  send_headers: false
  send_all_headers: false
  include_body_for: ["application/json"]
  send_request: true
  send_response: true
  transaction_timeout: 20s

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["1.1.1.1:9200"]
```

Most packets are not parsed due to the ParseHttp exception (slice bounds out of range).
My guess is that the cause is the packet length.
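For illustration, here is a minimal Go sketch of that failure mode (not Packetbeat's actual parser; `parseRequestLine` below is a hypothetical stand-in): when a captured request line is cut short and an expected delimiter is missing, a slice expression ends up with its high bound below its low bound and panics with exactly this runtime error.

```go
package main

import (
	"bytes"
	"fmt"
)

// parseRequestLine naively assumes a complete "METHOD SP URI SP VERSION" line.
func parseRequestLine(line []byte) (method, uri, version string) {
	i := bytes.IndexByte(line, ' ')               // end of method
	j := bytes.IndexByte(line[i+1:], ' ') + i + 1 // end of URI
	return string(line[:i]), string(line[i+1 : j]), string(line[j+1:])
}

func main() {
	// A complete request line parses fine.
	fmt.Println(parseRequestLine([]byte("GET /index.html HTTP/1.1")))

	// A capture truncated mid-URI (short snaplen, TCP gap, ...) has no second
	// space: IndexByte returns -1, so j == i and line[i+1:j] panics with
	// "slice bounds out of range".
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("recovered:", r)
		}
	}()
	parseRequestLine([]byte("GET /inde"))
}
```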

Any help is appreciated.
Thanks!

Could you share the debug logs of Packetbeat including the whole error you see?

Sure, there you go:

2018-02-16T13:59:05.332-0500 INFO instance/beat.go:468 Home path: [/usr/share/packetbeat/bin] Config path: [/usr/share/packetbeat/bin] Data path: [/usr/share/packetbeat/bin/data] Logs path: [/usr/share/packetbeat/bin/logs]
2018-02-16T13:59:05.335-0500 INFO instance/beat.go:475 Beat UUID: 418b8929-9abc-4c5b-9c17-ff0e13fd876e
2018-02-16T13:59:05.335-0500 INFO instance/beat.go:213 Setup Beat: packetbeat; Version: 6.2.1
2018-02-16T13:59:05.335-0500 INFO elasticsearch/client.go:145 Elasticsearch url: http://myelasticserver:9200
2018-02-16T13:59:05.335-0500 INFO pipeline/module.go:76 Beat name: server.name.fqdn
2018-02-16T13:59:05.336-0500 INFO procs/procs.go:78 Process matching disabled
2018-02-16T13:59:05.340-0500 INFO instance/beat.go:301 packetbeat start running.
2018-02-16T13:59:05.340-0500 INFO [monitoring] log/log.go:97 Starting metrics logging every 30s
2018-02-16T13:59:22.423-0500 ERROR runtime/panic.go:35 ParseHttp exception. Recovering, but please report this: runtime error: slice bounds out of range. {"stack": "github.com/elastic/beats/libbeat/logp.Recover\n\t/go/src/github.com/elastic/beats/libbeat/logp/global.go:88\nruntime.call32\n\t/usr/local/go/src/runtime/asm_amd64.s:509\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:491\nruntime.panicslice\n\t/usr/local/go/src/runtime/panic.go:35\ngithub.com/elastic/beats/packetbeat/protos/http.(*parser).parseHTTPLine\n\t/go/src/github.com/elastic/beats/packetbeat/protos/http/http_parser.go:179\ngithub.com/elastic/beats/packetbeat/protos/http.(*parser).parse\n\t/go/src/github.com/elastic/beats/packetbeat/protos/http/http_parser.go:107\ngithub.com/elastic/beats/packetbeat/protos/http.(*httpPlugin).doParse\n\t/go/src/github.com/elastic/beats/packetbeat/protos/http/http.go:293\ngithub.com/elastic/beats/packetbeat/protos/http.(*httpPlugin).Parse\n\t/go/src/github.com/elastic/beats/packetbeat/protos/http/http.go:224\ngithub.com/elastic/beats/packetbeat/protos/tcp.(*TCPStream).addPacket\n\t/go/src/github.com/elastic/beats/packetbeat/protos/tcp/tcp.go:115\ngithub.com/elastic/beats/packetbeat/protos/tcp.(*TCP).Process\n\t/go/src/github.com/elastic/beats/packetbeat/protos/tcp/tcp.go:208\ngithub.com/elastic/beats/packetbeat/decoder.(*Decoder).onTCP\n\t/go/src/github.com/elastic/beats/packetbeat/decoder/decoder.go:317\ngithub.com/elastic/beats/packetbeat/decoder.(*Decoder).process\n\t/go/src/github.com/elastic/beats/packetbeat/decoder/decoder.go:258\ngithub.com/elastic/beats/packetbeat/decoder.(*Decoder).OnPacket\n\t/go/src/github.com/elastic/beats/packetbeat/decoder/decoder.go:164\ngithub.com/elastic/beats/packetbeat/sniffer.(*Sniffer).Run\n\t/go/src/github.com/elastic/beats/packetbeat/sniffer/sniffer.go:193\ngithub.com/elastic/beats/packetbeat/beater.(*packetbeat).Run.func2\n\t/go/src/github.com/elastic/beats/packetbeat/beater/packetbeat.go:202"}
2018-02-16T13:59:35.342-0500 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":10810,"time":10818},"total":{"ticks":58880,"time":58888,"value":58880},"user":{"ticks":48070,"time":48070}},"info":{"ephemeral_id":"a78c6431-fe65-4b55-9c43-624537eac328","uptime":{"ms":30015}},"memstats":{"gc_next":7913065664,"memory_alloc":5098753768,"memory_total":10229529576,"rss":6588616704}},"http":{"unmatched_responses":315004},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"elasticsearch"},"pipeline":{"clients":12,"events":{"active":0}}},"system":{"cpu":{"cores":32},"load":{"1":1.83,"15":1.31,"5":1.57,"norm":{"1":0.0572,"15":0.0409,"5":0.0491}}},"tcp":{"dropped_because_of_gaps":14523}}}}
2018-02-16T14:00:05.342-0500 INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":18710,"time":18712},"total":{"ticks":105240,"time":105243,"value":105240},"user":{"ticks":86530,"time":86531}},"info":{"ephemeral_id":"a78c6431-fe65-4b55-9c43-624537eac328","uptime":{"ms":60014}},"memstats":{"gc_next":7542307872,"memory_alloc":7108923240,"memory_total":19867083776,"rss":2189402112}},"http":{"unmatched_responses":297639},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":12,"events":{"active":0}}},"system":{"load":{"1":2.12,"15":1.35,"5":1.66,"norm":{"1":0.0663,"15":0.0422,"5":0.0519}}},"tcp":{"dropped_because_of_gaps":18712}}}}
2018-02-16T14:00:15.803-0500 INFO beater/packetbeat.go:221 Packetbeat send stop signal
2018-02-16T14:00:15.884-0500 INFO instance/beat.go:308 packetbeat stopped.
2018-02-16T14:00:15.885-0500 INFO [monitoring] log/log.go:132 Total non-zero metrics {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":22060,"time":22065},"total":{"ticks":130490,"time":130496,"value":130490},"user":{"ticks":108430,"time":108431}},"info":{"ephemeral_id":"a78c6431-fe65-4b55-9c43-624537eac328","uptime":{"ms":70557}},"memstats":{"gc_next":7383853248,"memory_alloc":4384858264,"memory_total":23284031896,"rss":8257155072}},"http":{"unmatched_responses":716837},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"elasticsearch"},"pipeline":{"clients":12,"events":{"active":0}}},"system":{"cpu":{"cores":32},"load":{"1":2.24,"15":1.37,"5":1.7,"norm":{"1":0.07,"15":0.0428,"5":0.0531}}},"tcp":{"dropped_because_of_gaps":39174}}}}
2018-02-16T14:00:15.885-0500 INFO [monitoring] log/log.go:133 Uptime: 1m10.558590286s
2018-02-16T14:00:15.885-0500 INFO [monitoring] log/log.go:110 Stopping metrics logging.
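For context on the "Recovering, but please report this" wording: the stack trace shows the panic being caught by logp.Recover, so Packetbeat drops the offending packet and keeps sniffing instead of crashing. Below is a minimal sketch of that pattern (an assumed simplification, not the actual libbeat code):

```go
package main

import "log"

// recoverWith logs a recovered panic, in the spirit of the
// "Recovering, but please report this" line seen in the log above.
func recoverWith(msg string) {
	if r := recover(); r != nil {
		log.Printf("%s, but please report this: %v", msg, r)
	}
}

// parsePacket stands in for the HTTP parser; a panic inside it unwinds
// only as far as the deferred recover, so just this packet is dropped.
func parsePacket(data []byte) {
	defer recoverWith("ParseHttp exception. Recovering")
	// Pretend the parser needs at least 16 bytes of request line; a short
	// capture makes this slice panic with "slice bounds out of range".
	_ = data[:16]
}

func main() {
	parsePacket([]byte("GET /inde")) // truncated packet: panics, is recovered
	log.Println("still running: the panic was contained to a single packet")
}
```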

It seems like a bug. Do you mind opening an issue? https://github.com/elastic/beats/issues/new

Opened.
Can we expect a fix soon?
Thanks!

This topic was automatically closed after 21 days. New replies are no longer allowed.