HTTP output for Filebeat?

Hello,
We have a requirement to send logs from Filebeat to DataPower; we can't send Filebeat output directly to ES or Logstash.
Our LS and ES components are inside a secure zone. Some clients are outside the secure zone; on those we will install Filebeat and send the logs to Logstash via DataPower. DataPower then just forwards those logs using a pass-through URL.

Here I am facing a problem configuring the Filebeat output to DataPower. I need some kind of Filebeat output that will send an HTTP message to DataPower, which DataPower can then pass through.

Can't we have an HTTP output in Filebeat, like the HTTP input we have for Logstash?
Any suggestions?

br,
Sunil.

The elasticsearch output is based on HTTP. Does this help?
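If DataPower really passes HTTP through transparently, one option might be to point the elasticsearch output at the DataPower endpoint. A minimal sketch, assuming `<DP-host>:8455` (the placeholder host/port seen later in this thread) is the pass-through listener; the paths are illustrative:

```yaml
# filebeat.yml -- hostname, port, and log paths are placeholders
filebeat:
  prospectors:
    - input_type: log
      paths:
        - /var/log/app/*.log

output:
  elasticsearch:
    # DataPower pass-through endpoint, not a real Elasticsearch node
    hosts: ["http://<DP-host>:8455"]
```

Note that the output still expects Elasticsearch-compatible responses on the other side, so the pass-through must end at a real ES cluster.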

Hi,
I need to try this out...

Hi,
When I try to use the ES output, I get the error below:

2016-10-04T17:10:54+05:30 DBG  ES Ping(url=http://<DP-host>:8455, timeout=1m30s)
2016-10-04T17:10:54+05:30 DBG  Ping request failed with: 405 Method Not Allowed
2016-10-04T17:10:54+05:30 INFO Connecting error publishing events (retrying): 405 Method Not Allowed

br,
Sunil

Which Filebeat version are you using? The ES ping used to use HEAD requests, which are unfortunately disallowed by some HTTP proxies. 5.0 will use GET.

For generic HTTP see this discussion: Output beat events as plain HTTP POST

Code here: https://github.com/raboof/beats-output-http

You have to compile the Beat yourself, though, and it's not officially supported by Elastic.

Hello,
I allowed HEAD on the DataPower host.
It now shows this error:

2016-10-18T12:33:17+05:30 DBG  Sending bulk request to http://<datapower-host>:8455/_bulk
2016-10-18T12:33:18+05:30 ERR Failed to perform any bulk index operations: invalid character 'o' looking for beginning of value
2016-10-18T12:33:18+05:30 INFO Error publishing events (retrying): invalid character 'o' looking for beginning of value
2016-10-18T12:33:18+05:30 INFO send fail

Is this something related to the JSON object being sent? When I take the JSON object from the Filebeat debug log and send it to the same host:port from a browser REST client, it is sent successfully.
br,
Sunil

The elasticsearch output sends batches of records to Elasticsearch using the bulk API. This is done for performance reasons as sending a single event per request is inefficient. I therefore suspect it may be difficult to get this to work directly with your system.
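For context, the `_bulk` request body is newline-delimited JSON: an action line followed by the event source, one pair per event. The index name and fields below are illustrative, not taken from your setup:

```
{"index":{"_index":"filebeat-2016.10.18","_type":"log"}}
{"@timestamp":"2016-10-18T07:03:17.000Z","message":"example log line","beat":{"hostname":"client-01"}}
{"index":{"_index":"filebeat-2016.10.18","_type":"log"}}
{"@timestamp":"2016-10-18T07:03:18.000Z","message":"another log line","beat":{"hostname":"client-01"}}
```

A generic HTTP receiver that expects a single JSON document per request will not handle this format.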

One possible way around this, without adding a custom output to Filebeat, would be to have Filebeat send data to Logstash and then use the Logstash HTTP output plugin to send data to your system.
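A minimal Logstash pipeline along those lines might look like this; the port and target URL are placeholders, and `format => "json"` is just one of the formats the HTTP output plugin supports:

```
# Logstash pipeline: receive from Filebeat, forward as plain HTTP POSTs
input {
  beats {
    port => 5044
  }
}

output {
  http {
    # placeholder endpoint on the receiving system
    url => "http://<target-host>:8455/"
    http_method => "post"
    format => "json"
  }
}
```

With this approach each event is posted as an ordinary JSON document, so the receiver does not need to understand the bulk API.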

The bulk API response should be a JSON object itself; parsing seems to fail on the response.
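For comparison, a successful `_bulk` response begins with a JSON object like the one below; a plain-text reply from the proxy (for example one starting with `o`, as in `ok`) would trigger exactly the parse error you are seeing. The field values here are illustrative:

```
{
  "took": 30,
  "errors": false,
  "items": [
    { "index": { "_index": "filebeat-2016.10.18", "_type": "log",
                 "_id": "example-id", "_version": 1, "status": 201 } }
  ]
}
```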

Which exact filebeat version are you using? Can you capture the HTTP request and response using tcpdump?

This topic was automatically closed after 21 days. New replies are no longer allowed.