Logstash doesn't load data until interrupted

My cURL command in the input below invokes an HTTPS endpoint that streams data until interrupted. My config below runs, but no data is loaded into Elasticsearch until I interrupt Logstash. Logstash keeps the data in memory until I press Ctrl-C to kill the pipeline.

input {
  exec {
    command => 'curl -X GET "https://feed.test.com/1.0/json/A98F049" -H "api_key:zBAHA"'
    interval => 600000000
    codec => "json_lines"
  }
}
filter {
  json { source => "message" }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "abc"
  }
}

My screen shows curl's progress stats, which update in real time with the time since start and bytes received; bytes transferred stays zero:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
  0     0    0     0    0     0      0      0  --:--:--  0:00:28 --:--:--   887

Only after I kill the pipeline does the data show up on screen and in Elasticsearch. How can I index the data continuously as it arrives? Changing the interval value made no real difference.

The exec input plugin doesn't support streaming. It waits for the process to exit and only then passes the complete output to the codec to generate events.
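For a long-running command whose output should be consumed as it arrives, the pipe input plugin may be a better fit, since it reads the command's stdout line by line while the process is still running. A minimal sketch, assuming the same endpoint as above and that the feed emits one JSON document per line (`-N` disables curl's output buffering, `-s` suppresses the progress meter):

```
input {
  pipe {
    # pipe reads lines from the command's stdout as they are produced,
    # instead of waiting for the process to exit like exec does
    command => 'curl -sN "https://feed.test.com/1.0/json/A98F049" -H "api_key:zBAHA"'
    codec => "json_lines"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "abc"
  }
}
```

If the command exits (for example, the connection drops), the pipe input restarts it, which also makes it reasonably robust for a continuous feed.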

Thank you for clearing that up. It has been a long road to the exec input. I had started with the http and http_poller plugins, as shown here: How to set up input for https curl and header key?, without success. Which plugin is best practice for streaming with continuous indexing?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.