Base64 decode

Hello all,
I am working on something I have never worked on before, I really do not know where to go from here, and I am hoping someone might have some direction for me as I attempt to get this parsed into Elasticsearch properly.
I have been given a sample JSON file to parse into ES using Logstash; however, to review the relevant information, we would have to decode it via Base64. Since I have never worked on something like this before, I did a bit of research, which led me to install the cipher plugin for Logstash. As I continued down this path, I realized that I truly did not know where to go next, so I am hoping someone can at least help me get started as I attempt to write this pipeline. I have provided what I have so far (my filter is where my issue is) along with a sample record, hoping for some reference documentation or an understanding of how to write this.

Logstash conf.d

input {
  file {
    path => "/var/log/logstash/test/test/audit.txt"
    codec => "json"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}" }
  }
}

output {
    elasticsearch {
      hosts => [ "https://server.com:9200" ]
      ssl => true
      ssl_certificate_verification => false
      cacert => "/etc/logstash/config/certs/newfile.crt.pem"
      user => "elastic"
      password => "elastic_test_p@ssw0rd"
#       document_type => "doc"
      index => "test"
   }
 }

Log sample-

[
[
   {
      "2019-06-12 10:05:28 [UTC-0400]" : [
         {
            "user" : "admin",
            "address" : "10.0.1.6",
            "type" : "admin",
            "activity" : "Login"
         }
      ]
   },
   {
      "2019-06-12 10:38:36 [UTC-0400]" : [
         {
            "filetype" : "filter",
            "user" : "admin",
            "data" : [
               {
                  "diffprev" : [
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZXM9c3lzdGVtLHNwbGl0LGxhbWUsbGltaXQsYmxvY2tlZCx0cnVzdGVkLHZlcmlmaWVkLG1heHNpemUsbWF4c2l6ZTIsYXR0YWNoZXJyLGV4ZXN0cmlwLHBwX2FudGlzcG9vZixvZmZlbnNpdmUsc3BhbXNhZmUsc3BhbWJsb2NrLGhvbmV5cG9pbnQsbWF4ZHVyYXRpb24sc3ViamVjdGxpbmVfZW5jcnlwdCxkZXNrdG9wX3BsdWdpbl9lbmNyeXB0LHJlc3BvbnNlX2VuY3J5cHRfZXh0ZXJuYWwscHBfZXh0ZXJuYWxfdGFnLHRsc3Rlc3QxLHRsc3Rlc3QyLGRpc2NhcmRfYmFkX3JjcHRzLG1heHNpemVfZXhjbHVkZSx0ZXN0LGZvb2Jhcixmb29iYXIyLHRlc3RsaXN0LHRlc3R5LG91dGJvdW5kX3Rocm90dGxl"
                  ],
                  "diffrev" : [
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZXM9c3lzdGVtLHNwbGl0LGxhbWUsbGltaXQsYmxvY2tlZCx0cnVzdGVkLHZlcmlmaWVkLG1heHNpemUsbWF4c2l6ZTIsYXR0YWNoZXJyLGV4ZXN0cmlwLHBwX2FudGlzcG9vZixvZmZlbnNpdmUsc3BhbXNhZmUsc3BhbWJsb2NrLGhvbmV5cG9pbnQsbWF4ZHVyYXRpb24sc3ViamVjdGxpbmVfZW5jcnlwdCxkZXNrdG9wX3BsdWdpbl9lbmNyeXB0LHJlc3BvbnNlX2VuY3J5cHRfZXh0ZXJuYWwscHBfZXh0ZXJuYWxfdGFnLHRsc3Rlc3QxLHRsc3Rlc3QyLGRpc2NhcmRfYmFkX3JjcHRzLG1heHNpemVfZXhjbHVkZSx0ZXN0LGZvb2Jhcixmb29iYXIyLHRlc3RsaXN0LHRlc3R5LG91dGJvdW5kX3Rocm90dGxlLHJlZ2V4X2Z1bl8wMQ=="
                  ]
               },
               {
                  "diffprev" : null,
                  "diffrev" : [
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEucm91dGVzPQ==",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5lbmFibGU9dA==",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hdHRyaWJ1dGVzPW5vbmU=",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hY3Rpb25zPTEsMg==",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hY3Rpb24uMj1kaXNjYXJk",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hY3Rpb24uMT1xdWFyYW50aW5lIGZvbGRlcj0iUXVhcmFudGluZSI=",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEuaHR0cHBvc3QuYXR0cmlidXRlcz1ub25l",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEuZGVzY3JpcHRpb249",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEuY29uZGl0aW9ucz18fCwx",
                     "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEuY29uZGl0aW9uLjE9bXNnKHFyL2Zvby9pKQ=="
                  ]
               }
            ],
            "type" : "config"
         }
      ]
   },

I think the cipher filter is overkill. You can do this with a ruby filter:

ruby { code => 'event.set("decoded", Base64.decode64(event.get("message")))' }

will turn

   "message" => "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hY3Rpb24uMT1xdWFyYW50aW5lIGZvbGRlcj0iUXVhcmFudGluZSI=",

into

   "decoded" => "com.proofpoint.filter.module.access.rule.regex_fun_01.mail.action.1=quarantine folder=\"Quarantine\""
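To see the same thing outside Logstash, here is a minimal standalone Ruby sketch decoding that same sample value (the encoded string is taken directly from the log sample above):

```ruby
require "base64"

# One of the encoded values from the log sample above.
encoded = "Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEubWFpbC5hY3Rpb24uMT1xdWFyYW50aW5lIGZvbGRlcj0iUXVhcmFudGluZSI="

# Base64.decode64 returns the raw decoded bytes as a String.
decoded = Base64.decode64(encoded)
puts decoded
```

Inside a Logstash ruby filter the same `Base64.decode64` call is available, which is all the one-liner above relies on.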

Badger, just to clarify: your response used one of his fields as an example, right? Since he has some fields encoded in Base64 and others not, I figured you just used "message" as an example, but in reality, based on his log sample, do we need to replace:

event.get("message")

with

event.get("[message][embedded][field][diffprev]")

where diffprev appears to be one of his encoded fields? Please let me know.

Correct. The sample log is not valid JSON so there is no way to know what fields should be referenced.
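For what it's worth, once the JSON is made valid and the real field paths are known, the same idea extends to arrays of encoded values. A hypothetical standalone sketch (the "diffrev" field name is assumed from the sample, not taken from a real event):

```ruby
require "base64"
require "json"

# Hypothetical fragment shaped like the sample above, once made valid JSON.
record = JSON.parse('{"diffrev": ["Y29tLnByb29mcG9pbnQuZmlsdGVyLm1vZHVsZS5hY2Nlc3MucnVsZS5yZWdleF9mdW5fMDEucm91dGVzPQ=="]}')

# Decode every entry of the array, not just a single string.
decoded = record["diffrev"].map { |v| Base64.decode64(v) }
puts decoded
```

Inside a Logstash ruby filter the equivalent would be something along the lines of `event.set("decoded", event.get("[diffrev]").map { |v| Base64.decode64(v) })`, with the actual field path substituted once the input is valid JSON.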

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.