The cURL command in the input below invokes an HTTPS endpoint that streams data until it is interrupted. The config runs, but no data is loaded into Elasticsearch until I interrupt Logstash: it keeps the data buffered in memory until I press Ctrl+C to kill the pipeline.
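For context, this is the raw call the exec input wraps. Run directly in a shell it keeps printing JSON lines until I interrupt it, as described above (shown here only as a standalone sketch, with the same endpoint and api_key header as in the config):

curl -X GET "https://feed.test.com/1.0/json/A98F049" -H "api_key:zBAHA"

Here is the full Logstash config: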
input {
  exec {
    command => 'curl -X GET "https://feed.test.com/1.0/json/A98F049" -H "api_key:zBAHA"'
    interval => 600000000
    codec => "json_lines"
  }
}
filter {
  json { source => "message" }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "abc"
  }
}
While it runs, my screen shows curl stats like the ones below, updating in real time with the time since start and bytes received; bytes transferred stays at zero:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:28 --:--:--   887
Only after I kill the pipeline does the data show up on the screen and in Elasticsearch. How can I index the data continuously, as it arrives? Changing the interval value did not make any real difference.
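To be concrete, by changing the interval I mean varying this one line inside the exec block, for example (the value here is only illustrative, not the exact one I tried):

interval => 60

The events still only reach stdout and Elasticsearch after Ctrl+C.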