We recently tried to ingest SQL data into Elasticsearch using Elastic Agent (SQL Input Integration).
What I observed is that it stores the same set of data repeatedly.
For example, if I have 5 rows in the SQL table, the first run ingests those 5 rows, and then every subsequent run ingests the same 5 rows again.
Is there any way to configure the agent so that it ingests the 5 rows only once and then picks up only incremental data afterwards?
I tried making changes to the SQL query, but it made no difference.
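For reference, the kind of change I tried is sketched below in the style of the integration's query settings. The table, columns, and cutoff value are placeholders rather than our real schema, and the exact field names may not match what Fleet renders, so please treat this only as an illustration:

```yaml
# Illustrative sketch only: filter on a timestamp column, hoping that each
# collection interval would pick up just the rows added since the cutoff.
sql_queries:
  - query: >-
      SELECT id, name, updated_at
      FROM my_table
      WHERE updated_at > '2024-01-01 00:00:00'
      ORDER BY updated_at
    response_format: table
```

Even with a filter like this, the same rows are ingested again on every collection interval.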
Also, in the SQL Input Integration, is there a way to hide the password? At the moment the password has to be given openly inside the hosts setting.
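To illustrate what I mean, below is roughly how the connection is configured at the moment (the MySQL driver, host address, and credentials are placeholders and assumptions on my part); the password is written in plain text inside the hosts string:

```yaml
# Illustrative sketch only: the DSN in hosts embeds the password in plain text.
driver: "mysql"
hosts:
  - "myuser:mypassword@tcp(10.0.0.5:3306)/mydb"
```

Ideally we would be able to reference the password from some kind of secret or keystore rather than writing it directly into the policy.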
Finally, are there any alternative ways to ingest this data through integrations, rather than depending on Logstash?