How to replace my bash scripts with Logstash?

Hello (first of all, excuse my bad English!)

Currently :

I have developed many, many Bash scripts that check text files and manipulate them with zcat, awk, grep, sums... The results are written to a stats file.

Cacti then reads the results from this stats file and generates a graph.

BUT there is a lot of data and Cacti is starting to fall over. I want to replace it with the Elastic Stack (and get real-time results)!

I have finished the installation, but now I am trying to find out how to integrate my scripts into Logstash.

First of all: is it possible or not?

FOR EXAMPLE :

#!/bin/bash
#Script1

zcat files_text.gz | grep enterprise | awk -F "|" '{print $14"|"strtonum("0x"$22)"|"$31"|"strtonum("0x"$33)}' > "$tmp"

cat "$tmp" gives:

enterprise1|100 000 | UK | 20160212
enterprise2| 4 500 | ALL | 20140214
enterprise3| 25 000 | ESP | 20150218
enterprise4| 77 000 | ITA | 20150213

Then: print $1","$2 >> Centralized_stats_files

EXAMPLE OF Centralized_stats_files :

cat /stats/Centralized_stats_files

Script1,enterprise1,100 000
Script1,enterprise2,4500
Script1,enterprise3,25000
Script1,enterprise4,77000
Script2,[...]
Script2,[...]
[...]

Cacti retrieves the last two fields and generates the graph.

Can I do the same processing with ELK?

I know my question may seem stupid, but I want to learn this technology :confused:

Once again, excuse my English :confused:

Regards

Have a good day

Yes, it is possible.

You would use a standard file input plugin, though I am not sure it has a GZIP codec, so try an uncompressed file first: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html
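For reference, a minimal file input for a plain-text stats file might look like this (the path is just an assumption based on your example; adjust it to your setup):

```
input {
  file {
    # read the stats file produced by the bash scripts
    path => "/stats/Centralized_stats_files"
    # start from the top of the file rather than tailing new lines only
    start_position => "beginning"
  }
}
```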

You would then use something like the grok filter. I would parse all the data even if you don't want to use it yet, in case you want to do other things with it later.
https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html

The grok config would be a simple grok filter, something like:

filter {
  grok {
    match => ["message", "%{WORD:enterprise}\|%{NUMBER:stat:int}"]
  }
}
output {
  stdout { codec => rubydebug }
}

You can test your grok statements at https://grokdebug.herokuapp.com/

Then, for the output, you will have to use one of the output plugins: https://www.elastic.co/guide/en/logstash/current/output-plugins.html
But since Cacti only probes for new data through SNMP and scripts, I am not sure how you would insert the data there; read over the outputs and see if something makes sense for you.
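For example, if you end up sending the data to Elasticsearch instead, a minimal output (assuming a node on localhost with the default port) would look something like:

```
output {
  elasticsearch {
    # list of Elasticsearch nodes; the port defaults to 9200
    hosts => ["127.0.0.1"]
  }
}
```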

A simple output to stdout would be:

output {
  stdout { codec => rubydebug }
}

Hope this helps.

Hummm ....

Excuse me, but this is a new world for me.

Before:

1: Create a script.
2: Write the output to a file.
3: Retrieve the output from this file to generate a graph.

Now (with Logstash):

1: Create a script? Something like:

exec {
  command => "cat file.gz | awk '{print $1,$2,$3}' | sum{$2}"
  interval => 3600
}

???????

2: Write the output to a file, or show it with an echo command???

3: The output is sent to Elasticsearch?

Can you help me with my first graph?

I mean, does my script stay the same? Or must I port it into a Logstash configuration file?

I am lost in the woods.

It's your environment; if you prefer to run the script, go for it. But I prefer using the tools the way they are designed, though that is not right for all purposes, especially if you have other processes depending on it.

However, I don't think the exec input will deal with "|" pipe signs, but if it works :slight_smile: Great!
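If you do try the exec input, a rough sketch could look like the following (the file path and the hourly schedule are just assumptions; note that the exec input's interval is in seconds):

```
input {
  exec {
    # run the existing shell pipeline and ship its stdout as events
    command => "zcat /path/to/files_text.gz | grep enterprise | awk -F'|' '{print $14, $22}'"
    # run once an hour
    interval => 3600
  }
}
```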
Hi again,

Can you explain some things to me?

So, in "/etc/logstash/conf.d" I don't have any files.

If I want to make a pipeline to inject data into Elasticsearch, I configure it in "/usr/share/logstash/bin/" (for example to run a bash command).

So, when I look at the Logstash log, I can see "No config files found in path {:path=>"/etc/logstash/conf.d/*"}".

What should I do about that? Is it a problem if Logstash doesn't have any file in conf.d? Do you have an example of a conf file?

THANK YOU FOR YOUR HELP :blush:

Here is an example of a config that is in "/usr/share/logstash/bin/":

cat first_pi

input {
  file {
    path => "/home/user/ip.log"
    start_position => "beginning"
    type => "logs"
  }
}

filter {
  grok {
    match => {
      "message" => "%{IP:clientip}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1"]
  }
}

It's a good start... :slight_smile:

But I don't know what Logstash wants in conf.d/<config_file_logstash>

Well, take a look at /etc/sysconfig/logstash.

You don't have to have it there, but this is the default:

$DIRECTORY/logstash -f

By the way, you don't have to specify an asterisk "*": Logstash can take a directory or a filename. If given a directory, it will load all the files in it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.