Import bash output into Kibana, is it possible?

Hello everyone, is it possible to import the output of a bash script into a Kibana dashboard?

I have created the following script to look up email deliveries to a given recipient in Postfix, but I want to put together something visually more attractive.

Here is the script output:

==== Total Delivery Details for <emawata@gmail.com> ====
From: emanuel@emanuel-pc.democloud.com
Recipient: emawata@gmail.com
Date: Nov 2 09:28:10
LOG_ID: B1BA82A4F29
Status: status=sent (250 2.0.0 OK 1635856092 l13si15557346qtk.47 - gsmtp)

Here is the code:

#!/bin/bash

if (( $# != 1 )); then
  printf -- '%s\n' "" "Ingresar cuenta de correo" "Ej.: $0 rcpt@rcpt.com" "" >&2
  exit 1
fi

mail_dest="${1:?No mail_dest defined}"
mail_log="/var/log/mail.log"

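# Collect the unique postfix queue IDs for deliveries to ${mail_dest} from the mail log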
get_mail_queue_id() {
  grep -iw "to=<${mail_dest}>" "${mail_log}" | 
    grep -E "status=(sent|deferred|bounced)" | 
    awk -F"smtp" '{print $2}' | 
    cut -d":" -f2 | 
    sed -e 's/ //g' |
    sort | 
    uniq
}

# Usage: get_mail_path [to|from] qid
get_mail_path() {
  grep "${2:?No QID supplied}" "${mail_log}" | 
    sed -r 's/^([^;]*;)[^;]*;/\1/' | 
    awk 'BEGIN{FS=OFS=" "} {print $7}' | 
    grep -w "${1:?No mail direction supplied}=" | 
    tr -d , | 
    grep -E -o "\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,6}\b"
}

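# Usage: get_mail_date qid - print the timestamp of the first log line for that queue ID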
get_mail_date() {
  grep "${1:?No QID supplied}" "${mail_log}" | 
    sed -r 's/^([^;]*;)[^;]*;/\1/' |
    awk 'BEGIN{FS=OFS=" "} {print $1, $2,$3; exit}' 
}

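# Usage: get_mail_dsn qid - print the delivery status field (status=sent/deferred/bounced ...) for that queue ID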
get_mail_dsn() {
  grep "${1:?No QID supplied}" "${mail_log}" |
    awk -F "," '{print $6}' |
    uniq |
    sed '/^$/d'
}

printf -- '\n==== %s ====\n' "Total Delivery Details for <$1>"

for qid in $(get_mail_queue_id); do
  mail_source="$(get_mail_path from "${qid}")"
  mail_dest="$(get_mail_path to "${qid}")"
  mail_date="$(get_mail_date "${qid}")"
  mail_dsn="$(get_mail_dsn "${qid}")"

  printf -- 'From: %s\nRecipient: %s\nDate: %s\nLOG_ID: %s\nStatus: %s\n\n' \
    "${mail_source}" "${mail_dest}" "${mail_date}" "${qid}" "${mail_dsn}"
done

exit 0

I would save the results of the script to a local file, ingest it with the Logstash File input, and parse it with Grok or Dissect. You might want to adjust how the script writes the data to the log so it is easier to parse; a sketch follows below.
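A rough sketch of that idea (the output file path, the one-line key=value layout, and the index name below are my own assumptions, not something from the original script): the loop could append one line per delivery, e.g.

printf -- 'from=%s to=%s date=%s qid=%s dsn=%s\n' \
  "${mail_source}" "${mail_dest}" "${mail_date}" "${qid}" "${mail_dsn}" >> /var/log/postfix_report.log

and a minimal Logstash pipeline could then read and dissect those lines:

input {
  file {
    path => "/var/log/postfix_report.log"
    start_position => "beginning"
  }
}

filter {
  dissect {
    mapping => {
      "message" => "from=%{from} to=%{to} date=%{date} qid=%{qid} dsn=%{dsn}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "postfix-report"
  }
}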


I agree with aaron-nimocks: save the output to a file and read it with Logstash. You could also extend the script to bulk-insert the data in JSON format directly into Elasticsearch (a rough sketch below). A third option, which I haven't tried: Exec input plugin | Logstash Reference [7.15] | Elastic
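A minimal sketch of the bulk idea, assuming it is appended after the helper functions in the script above (the index name, host, and the naive JSON quoting are assumptions; if the fields can contain quotes you would want something like jq to build the JSON):

bulk_file="$(mktemp)"

for qid in $(get_mail_queue_id); do
  # One action line plus one document line per delivery, as the _bulk API expects
  printf -- '{ "index": { "_index": "postfix-report" } }\n' >> "${bulk_file}"
  printf -- '{ "from": "%s", "to": "%s", "date": "%s", "qid": "%s", "dsn": "%s" }\n' \
    "$(get_mail_path from "${qid}")" "$(get_mail_path to "${qid}")" \
    "$(get_mail_date "${qid}")" "${qid}" "$(get_mail_dsn "${qid}")" >> "${bulk_file}"
done

curl -s -H 'Content-Type: application/x-ndjson' \
  -XPOST 'http://localhost:9200/_bulk' --data-binary "@${bulk_file}"
rm -f "${bulk_file}"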
