Hello (and first of all, please excuse my English!)
Currently:
I have developed many, many BASH scripts that check text files and process them with zcat, awk, grep, sums, etc. The results are written to a stats file.
My Cacti installation then reads the results from this stats file and generates a graph.
BUT there is a lot of data and Cacti is starting to fall over. I want to replace it with the Elastic Stack (and get real-time results)!
I have finished the installation, but now I am looking for how to integrate my scripts into Logstash.
First of all: is this possible or not?
FOR EXAMPLE :
#!/bin/bash
# Script1 ($tmp holds the path of a temporary output file)
zcat files_text.gz | grep enterprise | awk -F "|" '{print $14"|"strtonum("0x"$22)"|"$31"|"strtonum("0x"$33)}' > "$tmp"
The contents of $tmp:
enterprise1|100 000 | UK | 20160212
enterprise2| 4 500 | ALL | 20140214
enterprise3| 25 000 | ESP | 20150218
enterprise4| 77 000 | ITA | 20150213
Fields $1 and $2 are then appended to Centralized_stats_files.
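From what I have read, the Logstash exec input might let me keep Script1 as-is and run it on a schedule. Just a sketch; the script path and the interval are my assumptions:

```
input {
  exec {
    # Hypothetical path to Script1 above; run it every 5 minutes
    command => "/opt/scripts/script1.sh"
    interval => 300
  }
}
```

Is this the right approach, or should Logstash read the output file instead?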
EXAMPLE OF Centralized_stats_files :
cat /stats/Centralized_stats_files
Script1,enterprise1,100000
Script1,enterprise2,4500
Script1,enterprise3,25000
Script1,enterprise4,77000
Script2,[...]
Script2,[...]
[...]
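Alternatively, I wonder if Logstash could simply tail this centralized file. A minimal pipeline sketch, where the path, the column names, and the index name are my assumptions (the gsub is there to strip the thousands separator in values like "100 000"):

```
input {
  file {
    path => "/stats/Centralized_stats_files"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["script", "enterprise", "value"]   # assumed column names
  }
  mutate {
    gsub    => [ "value", " ", "" ]                # "100 000" -> "100000"
    convert => { "value" => "integer" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "stats"
  }
}
```

Then Kibana could graph the "value" field instead of Cacti.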
Cacti then retrieves the last two fields and generates a graph.
Can I get the same processing with ELK?
I know my question may seem naive, but I want to learn this technology.
Once again, excuse my English.
Regards,
Have a good day!