Unable to parse multiline JSON data into Logstash

I have tried to parse my JSON data into separate fields in Logstash, but it is not parsed; the Logstash console gets stuck at the "pipeline started" line.

Input:

    {
        "id": 1,
        "first_name": "Frank",
        "last_name": "Mills"
    }

My config is:

    input {
        file {
            codec => multiline {
                pattern => "^\{"
                negate => true
                what => previous
                auto_flush_interval => 1
                max_lines => 2000
            }
            path => [ "D:/sample.json" ]
            start_position => "beginning"
            sincedb_path => "NULL"
        }
    }

    filter {
        grok {
            match => { "message" => "^.*?(?<logged_json>{.*)" }
        }
        mutate {
            gsub => [ 'logged_json', '\n', '' ]
            remove_field => [ "message", "@timestamp", "host", "path", "@version", "tags" ]
        }
        # parse the json and remove the string field upon success
        json {
            source => "logged_json"
            remove_field => [ "logged_json" ]
        }
    }

    output {
        stdout { codec => rubydebug }
        # Sending properly parsed log events to elasticsearch
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "test-metrics437"
        }
    }

Any help on this?

Your example file does not have a `[` in it, so Logstash will hang waiting for a line that starts with `[`. If you use `auto_flush_interval`, then the codec will flush an event after reading all the non-matching lines and hitting the timeout.

Hi @Badger

I have updated my config file (please check the updated config above), but it still hangs.

That codec configuration will only flush an event when it sees a second line that starts with `{`. If there is only one pretty-printed object in the file, it will never flush. As I said, use `auto_flush_interval`.
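For reference, a minimal codec sketch along those lines (the one-second interval is an illustrative value, not a recommendation):

    codec => multiline {
        pattern => "^\{"
        negate => true
        what => previous
        # flush the buffered lines as an event after 1 second of
        # inactivity, so a single pretty-printed object in the file
        # does not sit in the buffer forever
        auto_flush_interval => 1
    }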

Hi @Badger

After using `auto_flush_interval`, I got this error.

That's not a complete JSON object, so an error is expected.

Hi @Badger
No, for checking I trimmed down my input. It is still valid JSON, right? How do I fix this?

Hi @Badger

I have checked again; even with `auto_flush_interval` it still hangs. It sometimes works if I cut and paste the input again. I don't know what is happening here.

Just as an FYI, please don't post pictures of text or code. They are difficult to read, impossible to search and replicate (if it's code), and some people may not be even able to see them :slight_smile:

Hi @Badger

Any update on this?

Thanks

One thing I notice is that you have `sincedb_path => "NULL"`. That will persist the in-memory sincedb to a file called `NULL` in the working directory of Logstash. On Windows it should be `"NUL"` (the null device) instead.
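A sketch of the corrected file input, reusing the sample path from the config above:

    file {
        path => [ "D:/sample.json" ]
        start_position => "beginning"
        # "NUL" is the Windows null device, so read positions
        # are discarded instead of persisted to a file
        sincedb_path => "NUL"
        codec => multiline {
            pattern => "^\{"
            negate => true
            what => previous
            auto_flush_interval => 1
        }
    }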

OK @Badger, I will try this.