I have 2 records in my SQL Server database, and I have integrated SQL Server with the ELK stack. When I run my Logstash file, it works fine. Is it possible that, if I enter a third record in SQL Server, the data will appear in Kibana automatically without running the Logstash file again? Or do I have to run the Logstash file every time I insert data into SQL Server? Is there any way around this? What I want is that when I insert data into SQL Server, it shows up in Kibana.
If you are using Logstash, you will have to run it again for the new data to get into Elasticsearch, and only then will it show up in Kibana.
All data updates need to reach Elasticsearch before they can show up in Kibana.
If I understand correctly, you are saying that to update the data in Kibana I have to run Logstash every time? Is there any process to update the data in Elasticsearch automatically? Let's say I have parsed 10 records into the ELK stack from SQL Server and the process is running in the background, and now I want to insert another 10 records into SQL Server. Is there any possibility that the data will be imported into Elasticsearch without running Logstash another time?
Is there a way to do this automatically, without running Logstash manually each time?
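Yes — instead of running Logstash as a one-off job, you can keep it running and let the JDBC input plugin poll SQL Server on a schedule. The plugin's `schedule` option (cron-like syntax) re-runs the query periodically, and `tracking_column` with the `:sql_last_value` placeholder makes it fetch only rows added since the last run. Below is a minimal sketch; the connection string, credentials, table name `records`, column name `id`, and index name are placeholders you would replace with your own:

```
input {
  jdbc {
    # Path to the Microsoft SQL Server JDBC driver jar (adjust to your install)
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    # Poll every minute (cron-like schedule)
    schedule => "* * * * *"
    # Only fetch rows newer than the last value seen on the previous run;
    # :sql_last_value is maintained by the plugin between runs
    statement => "SELECT * FROM records WHERE id > :sql_last_value ORDER BY id"
    use_column_value => true
    tracking_column => "id"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "records"
  }
}
```

With a config like this, Logstash stays running in the background; each time new rows land in SQL Server, they are picked up on the next scheduled run and appear in Kibana without you doing anything manually. Note this assumes an incrementing column such as an identity `id`; without one, the plugin cannot tell which rows are new.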