No automatic Logstash shutdown with the Elasticsearch input plugin

I have an index X which continuously receives machine logs. I need to constantly reindex the data into index Y, which is connected to Kibana. My pipeline has an Elasticsearch input, a filter, and an Elasticsearch output. The reindexing works fine, but after all documents have been reindexed, Logstash shuts down automatically, presumably because one of the Elasticsearch plugins sends out a shutdown command.

However, index X still receives logs, which are then not stashed into index Y and consequently cannot be seen in Kibana. I don't think it's a problem with my configuration; it's just the way the Elasticsearch plugin is programmed. I know a possible workaround would be an exec plugin that restarts the pipeline at a given interval, but since the pipeline would be starting and shutting down every time, this would be very ugly and not resource friendly.

Is there any way to keep a pipeline with an Elasticsearch input alive instead of having it shut down automatically?

Thanks in advance

Yes, nowadays there's a schedule option to make the ES query at certain intervals.
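As a sketch of what that looks like (host and index names here are placeholders, not taken from your setup), the `schedule` option on the elasticsearch input takes a cron expression and re-runs the query at that interval instead of running once and terminating the pipeline:

```
input {
  elasticsearch {
    hosts    => ["localhost:9200"]
    index    => "X"    # source index (placeholder name)
    query    => '{ "query": { "match_all": {} } }'
    # Cron syntax: run the query every minute. With schedule set,
    # the input stays alive instead of shutting the pipeline down.
    schedule => "* * * * *"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "Y"       # destination index (placeholder name)
  }
}
```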

Thanks for your answer. But what do you mean by "query at certain intervals"? Do you mean Logstash quitting and starting again, which I do not want, or the Elasticsearch input staying alive without shutting down (unless it receives a SIGTERM), which is what I want?

Do you mean Logstash quitting and starting again, which I do not want

No.

or the Elasticsearch input staying alive without shutting down (unless it receives a SIGTERM), which is what I want?

Yes.

Okay, so what is the trick? From my understanding, the Elasticsearch input quits automatically by default once it has queried all documents.

Ah, I found it. Thanks!

@magnusbaeck Is the only option to query all of the documents again with the schedule option? Because this takes really long with a lot of documents and is not resource friendly. I want the Elasticsearch input plugin to watch only for new documents.

How would it know which documents are new?

What sends the data to index X? Can you insert yourself there and fork the data stream so it's sent to both index X and index Y?
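If the shipper that feeds index X happens to be Logstash itself (an assumption; the thread never says what it is), forking the stream is just a matter of declaring two elasticsearch outputs in the same pipeline, with the host and index names below as placeholders:

```
output {
  # Original destination, unchanged
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "X"
  }
  # Second copy of every event goes to the Kibana-facing index
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "Y"
  }
}
```

Every event is sent to every output in the block, so both indices receive the full stream without any rescheduled reindexing.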

I do not know, to be honest. How does Kibana watch for new documents? Does it query the entire index again and again?

This is difficult, and I need to reindex anyway because otherwise most fields cannot be parsed correctly by Kibana. Changing the logging mechanism on the machine side is not possible.

Thank you.

I do not know, to be honest. How does Kibana watch for new documents? Does it query the entire index again and again?

Yes, Kibana doesn't make incremental queries.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.