Logstash sending json written by Java

I currently have Java code that writes a JSON array to a file. However, Logstash only seems to attempt to parse the file if I manually go in and touch it myself; it does not pick it up after the initial write. How can I change this behavior so that the JSON array is sent to Logstash as soon as my Java program writes it? My config file is as follows:

    input {
        file {
            path => "/root/test_last/complete/example.json"
            codec => "json"
            start_position => "beginning"
            ignore_older => 0
        }
    }
    filter {
        mutate {
            gsub => [ "message", "\[", "" ]
            gsub => [ "message", "\n", "" ]
            gsub => [ "event", "\},\{", "," ]
        }
        json { source => "message" }
    }
    output {
        stdout { codec => rubydebug }
    }

'ignore_older => 0' tells Logstash to ignore any file more than zero seconds old, which is every file. You should remove this option.
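For example, the file input without that option (a sketch, assuming the path from the original config) would look like:

    input {
        file {
            path => "/root/test_last/complete/example.json"
            codec => "json"
            start_position => "beginning"
        }
    }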

When you say you are writing an array to the file does that mean you are appending to it?

It is unlikely you want both a json codec and a json filter. And if you have a json codec on a file input you do not have a field called message, so your filters will have no effect.
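If you wanted the filter route instead, one sketch is to drop the codec from the input and keep only the json filter, with a quoted source field (the mutate gsubs are not needed for a valid JSON file):

    input {
        file {
            path => "/root/test_last/complete/example.json"
            start_position => "beginning"
        }
    }
    filter {
        json { source => "message" }
    }

Either approach works on its own; it is combining the two that causes trouble.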

Just a single array containing a bunch of JSON objects.

OK, so if you remove the ignore_older option does it work as expected?

I start getting this parsing error:

    exception=>#<LogStash::Json::ParserError: Unexpected character (':' (code 58)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')

Your JSON is not valid JSON. For example, this will produce that error

    [ { "foo" :: 1 } ]
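To illustrate (in Python for brevity; any strict JSON parser, including the one Logstash uses, rejects this input the same way, and a hypothetical `is_valid_json` helper is just for demonstration):

```python
import json

def is_valid_json(text: str) -> bool:
    """Return True if text parses as JSON, False otherwise."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

# A well-formed array of objects parses fine...
print(is_valid_json('[ { "foo": 1 } ]'))    # True
# ...but the doubled colon triggers a parse error much like the one above.
print(is_valid_json('[ { "foo" :: 1 } ]'))  # False
```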

Hm, that's interesting, because I ran my JSON through jsonlint to validate it and it said it was valid JSON...
