Parsing JSON with Logstash

Hi all

I have a JSON file and I'm trying to parse it through Logstash into Elasticsearch and Kibana.

I have the following config.

input {
  file {
    path => "/appdir/logs/batch-timing.json"
    type => "dbatch"
  }
}

filter {
  if [type] == "dbatch" {
    json {
      source => "message"
    }
  }
}

output {
  if [type] == "dbatch" {
    elasticsearch {
      index => "app-dbatch"
      hosts => [ "11.111.21.374:9200" ]
      user => "logstash-USER"
      password => "PASS123"
    }
  }
}

I tried to parse the JSON file, but Logstash doesn't read it, so I added a new line after the JSON message. Kibana then only shows the new line, which results in a _jsonparsefailure. I then appended the same JSON message to the log file, but Kibana still shows nothing except the new line.

I think something is wrong with the way I'm configuring the output?

Help would be greatly appreciated.

Thanks

What does the log file look like? Is the entire JSON document in a single line?

Hey, thanks for the reply.

Yes, it is all on one single line:

{"data":[["Command Line","LOADGLOBAL",null,"1","2017/06/08 16:50:57","2017/06/08 16:51:01","success"],["Command Line","MXDATES",null,"1","2017/06/08 16:51:14","2017/06/08 16:51:15","success"],["Command Line","test/test_bcp",null,"1","1970/01/01 08:00:00","1970/01/01 08:00:00","failed"],["Command Line","EXE","echo","1","2017/06/08 17:03:57","2017/06/08 17:03:58","success"],["test_email","echo",">","1","2017/06/08 17:07:32","2017/06/08 17:07:33","success"],["test_email","echo",">>","1","2017/06/08 17:07:37","2017/06/08 17:07:38","success"],["test_email","EMAIL",null,"1","2017/06/08 17:07:42","2017/06/08 17:07:43","success"],["Command Line","TRANSFER","WashTrade_20170608.csv","0","2017/06/08 17:10:08","2017/06/08 17:10:09","failed"],["test_script","MxSQLQuery.sh","/murex/UAT313/cfg/MxSQLQuery_WashTrade.cfg","0","2017/06/08 17:08:25","2017/06/08 17:10:11","failed"],["Command Line","MXDATES",null,"1","2017/06/09 10:27:42","2017/06/09 10:27:43","success"]]}

Basically, it is a JSON object containing a single key called 'data', whose value is an array of arrays. Each inner array contains the following fields:

[Job, Type, Argument, Weight, Start Time, End Time, Status]

It looks like it is malformed, as there is a missing square bracket towards the end. The structure also does not look very suitable for Elasticsearch, so you may need to reformat it if you want to be able to search or aggregate on it in any meaningful way, possibly by using the split filter to separate it into multiple events and then populate the fields you listed.
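The split-and-populate approach could look something like the sketch below. This is only an illustration, not a tested config; the field names (`job`, `job_type`, `argument`, `weight`, `start_time`, `end_time`, `status`) are made up here to match the positional order listed above:

```conf
filter {
  if [type] == "dbatch" {
    json {
      source => "message"
    }
    # Split produces one event per inner array in [data]
    split {
      field => "data"
    }
    # Map the positional values onto named fields
    # (names are assumptions based on the field order above)
    mutate {
      add_field => {
        "job"        => "%{[data][0]}"
        "job_type"   => "%{[data][1]}"
        "argument"   => "%{[data][2]}"
        "weight"     => "%{[data][3]}"
        "start_time" => "%{[data][4]}"
        "end_time"   => "%{[data][5]}"
        "status"     => "%{[data][6]}"
      }
    }
    # Drop the original array once the named fields exist
    mutate {
      remove_field => [ "data" ]
    }
  }
}
```

Each row of the array would then become its own document in Elasticsearch, so you could, for example, filter on `status: failed` in Kibana.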

Sorry, I tried to make the log shorter and seem to have caused the malformation.
(I have checked and added the square bracket.)

What do you mean by "the structure is not suitable for Elasticsearch"?
How should I reformat it to make it more suitable for Elasticsearch?

Thanks

An array of values will make it difficult to search for a specific component of the array, so I suspect it may be better to parse the different parts out into either multiple documents or a nested one.
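If you do split the data into separate events, it is usually also worth parsing the timestamps into proper date fields so Kibana can use them for time-based views. A hedged sketch, assuming a `start_time` field in the `yyyy/MM/dd HH:mm:ss` format shown in your log:

```conf
filter {
  # Parse the textual start time into the event's @timestamp
  # ("start_time" is an assumed field name, not from your config)
  date {
    match  => [ "start_time", "yyyy/MM/dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
```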

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.