Logstash error for a multiline JSON file


(Subash) #1

I am getting the Logstash error below for a multiline JSON file.

Error:

[2018-02-19T19:59:01,332][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 9, column 25 (byte 210) after input {\n\t#stdin { }\n\tfile {\n\t\tpath => \"/home/sdc/kibana_test.json\"\n\t\tstart_position => \"beginning\"\n\t\tsincedb_path => \"/dev/null\"\n        #        codec => \"json\"\n\t\tcodec => multiline {\n\t\t\tpattern => \"^(\\n+|\\{\"", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:171:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:343:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

Conf file:

input {
        #stdin { }
        file {
                path => "/home/sdc/kibana_test.json"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        #        codec => "json"
                codec => multiline {
                        pattern => "^(\n+|\{)"
                        what => "previous"
                        negate => "true"
                        auto_flush_interval => 1
                        max_lines => 10000000
                        max_bytes => "200 MiB"
                }
        }
}

filter {
        mutate {
                remove_field => [ "path", "tags", "host", "@version", "@timestamp" ]
        }
        mutate {
                convert => {
                        "duration" => "integer"
                        "runid" => "integer"
                        "changelist" => "integer"
                        "serverid" => "integer"
                        "testpathid" => "integer"
                        "resultrootid" => "integer"
                        "statusid" => "integer"
                        "resultid" => "integer"
                }
        }
        date {
                match => [ "starttime", "YYYY-MM-dd HH:mm:ss" ]
                target => "starttime"
        }
        date {
                match => [ "endtime", "YYYY-MM-dd HH:mm:ss" ]
                target => "endtime"
        }
}

output {
        stdout {
                codec => rubydebug
        }
        #if ![tags] {
                elasticsearch {
                        hosts => ["10.177.219.149:9200"]
                        index => "mrtf_index"
                        user => "elastic"
                        password => "mrkibana"
                        #sniffing => true
                        #manage_template => true
                }
        #} else {
        #       file {
        #               path => "/home/sdc/kibana_data/std_mrtf.log"
        #               codec => rubydebug
        #       }
        #}
}

(Christian Dahlqvist) #2

You seem to have an unescaped " in your pattern that throws off parsing of the config file (which you can see from the inconsistent colour coding).


(Subash) #3

Thanks for replying. The same issue persists after updating it.

Config:

input {
        #stdin { }
        file {
                path => "/home/sdc/kibana_test.json"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        #        codec => "json"
                codec => multiline {
                        pattern => "^(\n+|\{")""
                        what => "previous"
                        negate => "true"
                        auto_flush_interval => 1
                        max_lines => 10000000
                        max_bytes => "200 MiB"
                }
        }
}

Error:

   ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 9, column 25 (byte 210) after input {\n\t#stdin { }\n\tfile {\n\t\tpath => \"/home/sdc/kibana_test.json\"\n\t\tstart_position => \"beginning\"\n\t\tsincedb_path => \"/dev/null\"\n        #        codec => \"json\"\n\t\tcodec => multiline {\n\t\t\tpattern => \"^(\\n+|\\{\"", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:171:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:343:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

(Christian Dahlqvist) #4

That is still not correct. The pattern is now two strings with a ) between them. You need to escape the " in the string, possibly like this: "^(\n+|\{\")".
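
In context, the corrected codec block would look something like this (untested; your other options are unchanged):

codec => multiline {
        pattern => "^(\n+|\{\")"
        what => "previous"
        negate => "true"
        auto_flush_interval => 1
        max_lines => 10000000
        max_bytes => "200 MiB"
}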


(Subash) #5

Thanks again. My JSON format is below, and somehow it is not working for me. Could you please help me define a pattern for this JSON format? Thanks in advance.

[

{
        "runid": 22759,
        "starttime": "2018-02-19 12:41:01",
        "endtime": "2018-02-19 13:14:03",
        "duration": 1982.695648,
        "context": "CI_MR27",
        "workspace": "",
        "changelist": 0,
        "username": "shiny",
        "serverid": 9,
        "configfile": "mrtf/config/ci/MR27_SmokeTest.json",
        "testname": "testset.SmokeTest_MR27",
        "testpathid": 5920,
        "resultrootid": 23275,
        "statusid": 3,
        "resultid": 2,
        "baselinechangelist": null
}, {
        "runid": 22758,
        "starttime": "2018-02-19 11:37:27",
        "endtime": "2018-02-19 12:05:47",
        "duration": 1700.233019,
        "context": "CI_CX_NEW",
        "workspace": "",
        "changelist": 0,
        "username": "shiny",
        "serverid": 9,
        "configfile": "mrtf/config/ci/450w_Cx_KPI.json",
        "testname": "testset.Cx_KPI",
        "testpathid": 15528,
        "resultrootid": 23274,
        "statusid": 3,
        "resultid": 6,
        "baselinechangelist": null
}, {
        "runid": 22757,
        "starttime": "2018-02-19 11:37:06",
        "endtime": "2018-02-19 12:13:50",
        "duration": 2203.609581,
        "context": "CI_CX_NEW",
        "workspace": "",
        "changelist": 0,
        "username": "shiny",
        "serverid": 9,
        "configfile": "mrtf/config/ci/Concerto_SmokeTest_T12.json",
        "testname": "testset.Cx_KPI",
        "testpathid": 15528,
        "resultrootid": 23273,
        "statusid": 3,
        "resultid": 6,
        "baselinechangelist": null
}]

(Subash) #6

Can someone help me?


(Christian Dahlqvist) #7

As you have lines ( }, { ) that belong to two events, I think you need to create a multiline pattern that captures all of that into one event. How long can these arrays be?
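
One way to write such a pattern, assuming each file holds a single array that starts with [ on its own line (a sketch, not tested against your file):

codec => multiline {
        # Any line that does not start a new array is folded into the
        # previous event, so the whole [ ... ] block becomes one event.
        pattern => "^\["
        negate => "true"
        what => "previous"
        # The last (only) event is completed by the flush interval, so keep
        # it short; the limits must be generous enough for large arrays.
        auto_flush_interval => 1
        max_lines => 10000000
        max_bytes => "200 MiB"
}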

You should then be able to use a json filter (with a target field defined) to parse it. After that you can use a split filter to split the parsed array into multiple events.
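
For example, something like this ("rows" is just an illustrative target field name):

filter {
        # Parse the buffered array into a single field.
        json {
                source => "message"
                target => "rows"
        }
        # Emit one event per array element.
        split {
                field => "rows"
        }
}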


(Subash) #8

Thanks again.
These arrays can be up to 10,000 elements long.
Could you please provide sample code for the json filter?


(Christian Dahlqvist) #9

That is probably too much to fit into a single event for parsing like that, so you will need to find a way to extract the events individually. I am not sure how best to do that, though.


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.