I'm relatively new to the ELK stack. How can I send data from a SQL table to Elasticsearch so that each run only picks up the rows added since the previous run? The SQL table contains Horizon View event logs.
Currently, for testing, I have set up a JDBC connection that successfully gathers the contents of the SQL table and sends them via Logstash to Elasticsearch. The problem is that it sends the entire table every time. So I'm looking for either a way to monitor the SQL table and push rows to Elasticsearch as they are added, or a way to stop Logstash from re-sending the whole table on each run, so I could set up a scheduled transfer.
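For reference, my current pipeline config looks roughly like this (the driver path, connection string, credentials, and table name below are placeholders, not my real values; the Horizon View event table name may differ depending on the configured table prefix):

```conf
input {
  jdbc {
    # Placeholder connection details for a SQL Server event database.
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://sqlhost:1433;databaseName=HorizonEvents"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    # This pulls the whole table on every run -- the behavior I want to avoid.
    statement => "SELECT * FROM dbo.event"
    # Run every five minutes.
    schedule => "*/5 * * * *"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "horizon-events"
  }
}
```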
Maybe with Beats?