Parse JSON file with Logstash

hey

I have the following JSON file:

{
"time":"2015-09-20;12:13:24",
"bug_code":"test",
"stacktrace":1235
}

I got this from a Python script, and Logstash reads it with the following configuration:

    input {
      file {
        path => "D:\logstash_pipeline\test.json"
        start_position => "beginning"
        sincedb_path => "var/log/kibana/kibana.stdout" # is this right?
        codec => "json"
      }
    }


    output {
      stdout {
        codec => json # for debugging
      }

      file {
        path => "logs/log.txt"
        codec => json_lines
      }

      elasticsearch {
        hosts => ["localhost:9200"]
        index => "json_test"
        document_type => "jenkins_perfReport"
      }

      if "_jsonparsefailure" in [tags] {
        file {
          path => "logs/_jsonparsefailure.txt"
          codec => json_lines
        }
        stdout {
        }
      }
    }


This is what the output looks like:

{"path":"D:\\logstash_pipeline\\test.json","@version":"1","@timestamp":"2018-06-06T12:10:13.512Z","message":"\"bug_code\":\"test\",\r","host":"bla","tags":["_jsonparsefailure"]}
{"path":"D:\\Jlogstash_pipeline\\test.json","@version":"1","@timestamp":"2018-06-06T12:10:13.512Z","message":"\"stacktrace\":1235\r","host":"bla","tags":["_jsonparsefailure"]}
{"path":"D:\\logstash_pipeline\\test.json","@version":"1","@timestamp":"2018-06-06T12:10:13.466Z","message":"{\r","host":"bla","tags":["_jsonparsefailure"]}
{"path":"D:\\logstash_pipeline\\test.json","@version":"1","@timestamp":"2018-06-06T12:10:13.497Z","message":"\"time\":\"2015-09-20;12:13:24\",\r","host":"bla","tags":["_jsonparsefailure"]}

I thought that I would see the information in Kibana like:
bug_code: test
stacktrace: 1235

Questions:
1.) Shouldn't the codec be json_lines? If I use it there is no result. Do I have to install json_lines?
2.) How do I have to mutate the fields to get the result I expect?
3.) How can I add a counter that increments an id for every

{
"time":"2015-09-20;12:13:24",
"bug_code":"test",
"stacktrace":1235
}

Does the file look exactly like this, i.e. is the JSON message spread out over multiple lines? Does a file contain multiple such messages?
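If it is spread over multiple lines, note that the file input emits one event per line, so the json codec only ever sees fragments (that's exactly what's producing the _jsonparsefailure events above). One common approach is a multiline codec that joins everything up to the next line starting with `{`, plus a json filter to parse the joined message. A sketch, untested against your exact file:

```
input {
  file {
    path => "D:/logstash_pipeline/test.json"
    start_position => "beginning"
    sincedb_path => "NUL"
    # Join lines that do NOT start with "{" onto the previous event
    codec => multiline {
      pattern => "^\{"
      negate => true
      what => "previous"
    }
  }
}

filter {
  # Parse the reassembled JSON object
  json {
    source => "message"
  }
}
```

One caveat: the multiline codec only flushes an event when the next line matching the pattern arrives, so the last object in the file may sit unflushed until more data is written (the codec's auto_flush_interval option can help with that).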

sincedb_path => "var/log/kibana/kibana.stdout" # is this right?

No, that doesn't really make sense. Technically it probably works but the path is at best misleading in its name.

3.) How can I add a counter that increment an id => for every

Why do you want to do that?

Thank you very much for your response. It doesn't have to look like this. If I put it together on one line I still get no result in Kibana... nothing happens in Logstash.

I changed the sincedb_path to "/dev/null"

Why do you want to do that?
I want to add a pie chart with the latest information from a specific build. If I use the information as-is, it just adds all passed tests together.

I changed the sincedb_path to "/dev/null"

On Windows use "nul", not "/dev/null".
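As for the counter question: if you do end up needing one, a ruby filter can maintain it, though a counter like this resets whenever Logstash restarts and is only reliable with a single pipeline worker (-w 1). A sketch:

```
filter {
  ruby {
    # Initialize the counter once when the filter is created
    init => "@seq = 0"
    # Increment and attach it to each event as the "id" field
    code => "@seq += 1; event.set('id', @seq)"
  }
}
```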

Increasing the loglevel could give clues about what's going on.
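The log level can be raised when starting Logstash from the command line (the config filename here is a stand-in for yours):

```
bin/logstash -f pipeline.conf --log.level=debug
```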

So I changed my input to:

    input {
      file {
        path => "C:\Users\test\data.json" # also tried ["C:\Users\test\data.json"]
        start_position => "beginning"
        sincedb_path => "NUL"
        codec => "json" # Is codec a string?
      }
    }


and

    output {
      stdout {
        codec => "json"
      }

      file {
        path => "logs/log.txt"
        codec => "json"
      }

      elasticsearch {
        hosts => ["localhost:9200"]
        index => "json_test"
        document_type => "jenkins_perfReport"
      }

      if "_jsonparsefailure" in [tags] {
        file {
          path => "logs/_jsonparsefailure.txt"
          codec => "json"
        }
      }
    }

The complete log is too long to post. When I read the log I see this as the problem description:

:exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 1, column 1 (byte 1) after "

My conf file starts at line 1. I use Notepad++. I have no filter so far...

Make sure you don't have any garbage characters (like a byte-order mark) at the very beginning of the file.
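A quick way to check for a UTF-8 BOM is to look at the first three bytes of the file; a sketch in Python ("test.conf" is a stand-in filename):

```python
# Check whether a config file starts with a UTF-8 byte-order mark (EF BB BF)
import codecs

def has_bom(path):
    """Return True if the file begins with a UTF-8 BOM."""
    with open(path, "rb") as f:
        return f.read(3) == codecs.BOM_UTF8

# Demonstration: a file saved as "UTF-8 with BOM" trips the check
with open("test.conf", "wb") as f:
    f.write(codecs.BOM_UTF8 + b"input {}\n")

print(has_bom("test.conf"))  # → True
```

If the check comes back True, re-save the file without the BOM; Notepad++'s Encoding menu can convert a file to UTF-8 without a BOM.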

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.