Parsing a json file in logstash?

I'm sending log messages from a remote client to my ELK server, and all is well. The logs arrive in a JSON format that looks something like this:

{
        "type" => "trm-system",
        "host" => "susralcent09",
   "timestamp" => "2016-09-01T16:21:54.762437-04:00",
    "@version" => "1",
    "customer" => "cf_cim",
        "role" => "app_server",
  "sourcefile" => "/usr/share/tomcat/dist/logs/trm-system.log",
    .........
}

In my Logstash configuration files, how do I parse the value of "sourcefile" to ultimately get the filename, e.g. trm-system.log? I then want to use the result (the filename) to create the file locally under some local path.

Well, that's not a JSON file, so it'll take some work to parse. You might be able to use the kv filter. You'll also have to use a multiline codec to join the multiple physical lines into a single logical event.
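For the multiline part, a rough sketch of what that codec could look like on a tcp input; the pattern and options here are illustrative assumptions (matching events that start with an opening brace), not tested against your exact format:

input {
  tcp {
    port => 5514
    codec => multiline {
      # Any line that does not start with "{" belongs to the previous event
      pattern => "^\{"
      negate => true
      what => "previous"
    }
  }
}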

Thanks. Well, at least that's the output shown in my stdout. I have the following input and output sections in my Logstash configuration files.

input {
  tcp {
    port => 5514
    codec => json
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

I changed the codec to json in the stdout plugin, and I get this

{"type":"trm-system","host":"susralcent09","timestamp":"2016-09-01T16:37:50.221220-04:00","@version":"1","customer":"cf_cim","role":"app_server","sourcefile":"/usr/share/tomcat/dist/logs/trm-system.log", ....}

Okay, now I get what you're really asking. Your question is completely unrelated to JSON; you just want to extract the filename from a filepath. Use a grok filter that matches against the sourcefile field and captures everything after the last slash:

grok {
  match => {
    "sourcefile" => "/(?<filename>[^/]+)$"
  }
}
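To then write each event out under that filename, a file output can reference the captured field via sprintf syntax; a minimal sketch, where /some/local/path is a placeholder for whatever local directory you choose:

output {
  file {
    # "filename" is the field captured by the grok filter above
    path => "/some/local/path/%{filename}"
  }
}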

Thanks! It worked.