HTTP_Poller Plugin - Guidance Is Appreciated

Hello,

I'm trying to bring data into Logstash via HTTP.

My application's logs are hosted over HTTP at a URL of this format: http://subdomain.domain.com:portnumber/path_to_log/

Now, my question is: how can I configure my Logstash configuration file to read from this endpoint?

My configuration file:

input {
  http_poller {
    # the option is `urls` (plural); each entry is a named request
    urls => {
      test1 => {
        method => get
        url => "http://subdomain.domain.com:port/path_to_log/"
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 60
    # newer versions of the plugin use `schedule` instead of `interval`
    schedule => { "every" => "60s" }
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}

filter {
  date {
    match => [ "timestamp", "HH:mm:ss.SSS" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "user"
    password => "password*"
  }
  stdout {
    codec => rubydebug
  }
}

And in what way is the configuration above not working?

My apologies. I actually got it to work, but since posting this question I've managed to successfully set up a pipeline from Filebeat => Logstash => Elasticsearch => Kibana. Ideally, I would have liked either Logstash or Filebeat to point at the URLs where the logs are hosted and only poll new events. I'm not sure whether that would have worked, but I think I will have to settle for writing a script that downloads the logs, on a scheduled interval, into the directory Filebeat is watching (a rough sketch is below). I know that in an ideal setup Filebeat would sit on the server where the logs are produced, but my boss doesn't want anything outside of the application we support running in production.
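
For what it's worth, here is a rough Python sketch of the kind of download script I have in mind. The URL, local directory, filename, and interval are placeholders rather than anything from my actual setup, and it assumes the web server honours HTTP Range requests so only new bytes are appended between polls; if the server ignores the Range header, it just falls back to rewriting the whole file.

#!/usr/bin/env python3
"""Rough sketch: periodically download an HTTP-hosted log into the
directory Filebeat is watching. LOG_URL, FILEBEAT_DIR, and POLL_SECONDS
are placeholders -- adjust them for your environment."""

import time
import urllib.error
import urllib.request
from pathlib import Path

LOG_URL = "http://subdomain.domain.com:portnumber/path_to_log/app.log"  # placeholder
FILEBEAT_DIR = Path("/var/log/polled")   # directory the Filebeat input points at
POLL_SECONDS = 60                        # polling interval


def fetch() -> None:
    """Fetch only the bytes we have not written yet and append them locally."""
    target = FILEBEAT_DIR / "app.log"
    offset = target.stat().st_size if target.exists() else 0
    req = urllib.request.Request(LOG_URL, headers={"Range": f"bytes={offset}-"})
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            body = resp.read()
            partial = resp.status == 206  # server honoured the Range header
    except urllib.error.HTTPError as err:
        if err.code == 416:  # range not satisfiable: no new data since last poll
            return
        raise
    if partial:
        # Appending keeps the same inode, so Filebeat only ships the new lines.
        with open(target, "ab") as fh:
            fh.write(body)
    else:
        # Server ignored the Range header and sent the whole log; rewrite the file.
        target.write_bytes(body)


if __name__ == "__main__":
    FILEBEAT_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        try:
            fetch()
        except Exception as exc:  # keep polling even if one fetch fails
            print(f"fetch failed: {exc}")
        time.sleep(POLL_SECONDS)

Running it under cron (or as a simple systemd service) instead of the while loop would also work; the main point is that appending to the same file lets Filebeat pick up only the new events.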
