Hi, I'm new to ELK and want to ask: can ELK monitor jobs from Apache Airflow, SQL Server, or other applications? And can it monitor them comprehensively, e.g. give the history of a job, its last run time, its condition (error, idle, etc.), and alert when there is an error?
Yes, it can.
You have to write some kind of code to retrieve that data from those applications and store it in ELK.
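For the Airflow part of the question, one hedged way to write such code is to pull DAG-run history from Airflow's stable REST API (available in Airflow 2.x) and reduce it to the last run time and a count of run states; the URL, DAG id, and missing authentication are placeholders for your environment:

```python
# Hedged sketch: pull DAG-run history from Airflow's stable REST API
# (assumes Airflow 2.x with the API enabled; URL/auth are placeholders).
import json
import urllib.request
from collections import Counter

def summarize_runs(runs):
    """Reduce a list of DAG-run dicts to last run time and state counts."""
    by_state = Counter(r["state"] for r in runs)
    last = max((r["end_date"] for r in runs if r.get("end_date")), default=None)
    return {"last_run": last, "states": dict(by_state)}

def fetch_dag_runs(base_url, dag_id):
    """Call GET /api/v1/dags/{dag_id}/dagRuns and return the run list."""
    req = urllib.request.Request(f"{base_url}/api/v1/dags/{dag_id}/dagRuns")
    # A real setup would add an Authorization header here.
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["dag_runs"]

# Usage (placeholder URL/DAG): the resulting summary dict is what you would
# index into Elasticsearch, e.g.
#   summary = summarize_runs(fetch_dag_runs("http://localhost:8080", "my_dag"))
```

The summary document is small and already normalized, so it can be indexed into Elasticsearch as-is and alerted on (e.g. when `states` contains a nonzero `failed` count).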
Okay, thanks! Is there any best practice for that?
Your question is very vague.
If you can give a more detailed description, then someone on the forum can give you more detail.
But from a broad perspective: would you write Python code to pull data from that application's log files?
Or send metrics from the app directly to Elastic?
Okay bro, for example: give me best practices for monitoring MSSQL jobs on ELK and for monitoring Apache Airflow jobs on ELK. I think I'm going to look at the application logs (MSSQL / Apache Airflow).
For MSSQL I would write Python code to pull the job data, normalize it, and then save it into ELK.
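As a hedged sketch of that normalize-before-indexing step for SQL Server Agent jobs: job history lives in `msdb.dbo.sysjobhistory`, which stores the run date and time as integers and the outcome as a status code, so a small helper can turn each row into an Elasticsearch-friendly document. The job name and row values below are illustrative:

```python
# Hedged sketch: normalize SQL Server Agent job history rows
# (msdb.dbo.sysjobhistory) into documents before indexing into Elasticsearch.
from datetime import datetime

# run_status codes as documented for sysjobhistory.
STATUS = {0: "failed", 1: "succeeded", 2: "retry", 3: "canceled", 4: "in progress"}

def history_row_to_doc(job_name, run_date, run_time, run_status):
    """Convert sysjobhistory's integer columns to an ES-friendly document.

    run_date is an int shaped like YYYYMMDD; run_time is an int like HHMMSS.
    """
    ts = datetime.strptime(f"{run_date:08d}{run_time:06d}", "%Y%m%d%H%M%S")
    return {
        "job_name": job_name,
        "@timestamp": ts.isoformat(),
        "status": STATUS.get(run_status, "unknown"),
    }

# Pulling the rows would use a SQL Server driver such as pyodbc (not shown):
#   SELECT j.name, h.run_date, h.run_time, h.run_status
#   FROM msdb.dbo.sysjobhistory h
#   JOIN msdb.dbo.sysjobs j ON h.job_id = j.job_id
# Each row then goes through history_row_to_doc before a bulk index request.
```

Converting the integer date/time into a real `@timestamp` up front is what lets Kibana sort, graph, and alert on job runs without any extra mapping work.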
For logs I would install Filebeat on the log server and send the data to Elasticsearch. If I need to parse the logs and the parsing is a little complicated, then I would send the Filebeat data to Logstash, parse it there, and save it into Elasticsearch.
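A minimal `filebeat.yml` sketch of that Filebeat-to-Logstash setup; the log paths, input id, and Logstash host are placeholders for your environment:

```yaml
# Hedged sketch: ship application logs to Logstash for parsing.
filebeat.inputs:
  - type: filestream
    id: airflow-logs            # any unique id for this input
    paths:
      - /var/log/airflow/*.log  # placeholder path to your application logs

output.logstash:
  hosts: ["logstash.example.com:5044"]  # 5044 is the default Beats port
```

If the logs need no parsing, the `output.logstash` section can be swapped for `output.elasticsearch` to skip Logstash entirely.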
Okay, thanks Sachin! I'll let you know if I have another question.