Output Exec Script

Hello, I’m attempting to read a log file and run a script as output.

The script accepts STDIN data, but I don't think I've set up the Logstash config properly, since the script never gets executed the way it's currently set up. (I've tested the script without Logstash and it's functioning properly, so I'm guessing something is amiss in my setup.)

input {
    file {  
        path  => "/var/logs/foo.log"
        codec => "plain"
    } 
}

output {
    exec {
        command => "php /bin/scripts/process.php"
    }
}

Try using the full path to the script?

Sorry, I actually am; I changed the paths for the sake of this question. Both file paths are fully qualified. (I've edited the question to include the full paths.)

Does LS have privs to execute that script? Try enabling debug when you run LS as well.
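For example (assuming a Logstash 1.x install and that your config file path looks something like this), you can run it in the foreground with debug logging:

```shell
bin/logstash --debug -f /etc/logstash/conf.d/foo.conf
```

That should at least tell you whether the exec output is being invoked and failing, or never invoked at all.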

However, if you are running this a few times without changing the input path, chances are you're running into a sincedb problem: the file won't be re-read, which means the script won't be run.
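If that's what's happening, a common trick during testing is to point `sincedb_path` at `/dev/null` so Logstash forgets its read position between runs, and to set `start_position` so the whole file is read. A sketch based on your input:

```
input {
    file {
        path           => "/var/logs/foo.log"
        codec          => "plain"
        sincedb_path   => "/dev/null"
        start_position => "beginning"
    }
}
```

Don't leave `sincedb_path => "/dev/null"` in place in production, or you'll re-process the whole file on every restart.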

I've been executing this repeatedly during tests:

echo "ABC" >> foo.log 

I just added this to my output:

stdout {
    codec => rubydebug
}

This is what I see in the logstash.stdout:

{
    "message" => "ABC",
    "@version" => "1",    
    "@timestamp" => "2015-06-16T00:58:16.577Z",
    "host" => "dev.neoform",
    "path" => "/var/logs/foo.log"
}

I tested the configs; it says nothing is wrong.

Maybe I'm making a mistake here: when the command is run, does Logstash send it any data via STDIN? Do I need to pipe it in explicitly?

Yes, you'll need to pass one or more of the fields that Logstash creates to the script as command-line arguments. This doesn't happen automagically. So try the following output:

output {
    exec {
        command => "php /bin/scripts/process.php %{message}"
    }
}

The exec output as you had it just guarantees that the command will execute once for each document processed by Logstash, not that the command will receive each document. So you could use exec to do something wholly separate from each document if you wanted.

Is there a more appropriate way to pass STDIN content to a script in this way? If I do:

echo "%{message}" | php /bin/scripts/process.php

That would work, but doesn't seem very secure...

Not secure in the sense that you can briefly see "%{message}" in the process table when the exec runs, or in the sense that you're passing arbitrary data to a script?

For the former, there is not much you can do. For the latter, this would be something you ultimately need to handle yourself by making sure your script isn't prone to common security issues.

If you can eliminate the script entirely with a combination of filters and outputs in Logstash, that's your best bet. If you can't replace the logic in the script, you could modify it to listen on a TCP/UDP port or a UNIX socket and then use the tcp, udp, or file (for a socket) outputs, respectively, which solves the first security problem above.
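For example, if the script were changed to listen on a local TCP port (the port and codec here are just placeholder choices), the Logstash side might look like:

```
output {
    tcp {
        host  => "127.0.0.1"
        port  => 9999
        codec => "json_lines"
    }
}
```

That way the message never touches a shell command line at all; the script just reads one JSON document per line from the socket.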

Well, it's insecure in the sense that I'm just echoing data into a command and executing it. Since the messages are logs from a web server, a lot of the data I'm pumping in is user-generated.

I was hoping there was a way to execute a command and have Logstash pass the data via STDIN without having to pipe it in on the command line (as shown above), or at least not without first escaping the message content to make sure it doesn't contain unescaped quotes, since those would break this:

echo "%{message}" | php /bin/scripts/process.php

Making sure STDIN is properly quoted only solves one aspect of security here. Ultimately, the fundamental problem is that you're passing arbitrary data to a script that executes on the server running Logstash :frowning:

I just realised you might be able to use the pipe output, but this isn't any more secure than using exec.
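For reference, a sketch of what that would look like (`message_format` is a real option on the pipe output; with it, each event is written to the command's STDIN as one formatted line):

```
output {
    pipe {
        command        => "php /bin/scripts/process.php"
        message_format => "%{message}"
    }
}
```

Note that the pipe output starts the command once and keeps writing events to it, rather than spawning a fresh process per event.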

If the input were passed via STDIN, there wouldn't be any risk; I can easily handle the input safely in the script. The issue is how to get it piped into the script without just dumping it onto the command line.

I believe the pipe output only works if I have a continuously running command listening to the pipe, no?

I was hoping to be able to run this script once for each line in the log file...