Parsing a pretty formatted JSON file using Logstash

Hi,

I am pretty new to the Elastic Stack. I need to index data from a JSON file. Below is my JSON file format:
[ {
"metricId" : 1776736,
"metricName" : "Hardware Resources|Memory|Used %",
"metricPath" : "Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %",
"frequency" : "TEN_MIN",
"metricValues" : [ {
"startTimeInMillis" : 1582555800000,
"occurrences" : 1,
"current" : 25,
"min" : 25,
"max" : 25,
"useRange" : true,
"count" : 20,
"sum" : 500,
"value" : 25,
"standardDeviation" : 0
}, {
"startTimeInMillis" : 1582556400000,
....
}]
}, {
"metricId" : 1776736,
"metricName" : "Hardware Resources|Memory|Used %",
"metricPath" : "Application Infrastructure Performance|Root|Individual Nodes|FBLTESTUXP|Hardware Resources|Memory|Used %",
"frequency" : "TEN_MIN",
"metricValues" : [ {
"startTimeInMillis" : 1582555800000,
.....
} ]
} ]

Can anyone please provide a Logstash configuration file for the above JSON file?

Try this:

input {
	file {
		path => "pathOfYourJsonFile"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => "json"
	}
}

output {
	elasticsearch {
		hosts => ["Ip_Address_of_Elasticsearch:Port_No"] # e.g. localhost:9200
		index => "Index_Name"
	}
}

Thank you for the reply. I tried the configuration below.

input {
	file {
		path => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json"
		start_position => "beginning"
		sincedb_path => "nul"
		codec => json
	}
}

filter {
}

output {
	stdout { codec => rubydebug }
#	elasticsearch {
#		hosts => ["localhost:9200"]
#		index => "appdynamics_metrics2"
#	}
}

I am getting the error below.

[2020-03-19T13:54:12,274][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input: expected close marker for Object (start marker at [Source: (String)"[ {"; line: 1, column: 3])
 at [Source: (String)"[ {"; line: 1, column: 7]>, :data=>"[ {"}

I need each nested field to be a separate field in the output. Any suggestions?

Hi, can you show me the message field? If possible, also show the contents of the file. In the file, is every field of the JSON object written on a new line, like you have shown above?
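For context on what the json codec runs into here: the file input emits one event per line, and each line of a pretty-printed file is not a complete JSON document on its own. A quick illustration of the same line-by-line idea in Python (not Logstash itself, just a sketch):

```python
import json

# The first few lines of the pretty-printed file, as a line-oriented
# reader would see them
lines = [
    '[ {',
    '  "metricId" : 1776736,',
    '  "frequency" : "TEN_MIN",',
]

# Parsing each line independently fails, which is why Logstash reports a
# JSON parse error per line and leaves the raw text in the message field.
for line in lines:
    try:
        json.loads(line)
        print("parsed OK:", line.strip())
    except json.JSONDecodeError as err:
        print("parse error:", line.strip(), "->", err.msg)
```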

Thanks for the help. It is writing the errors below after Logstash is started.

[2020-03-19T17:31:23,482][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-19T17:31:23,618][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input: expected close marker for Object (start marker at [Source: (String)"[ {"; line: 1, column: 3])
 at [Source: (String)"[ {"; line: 1, column: 7]>, :data=>"[ {"}
[2020-03-19T17:31:23,666][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"  \"metricId\" : 1776736,"}
[2020-03-19T17:31:23,669][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"  \"metricName\" : \"Hardware Resources|Memory|Used %\","}
[2020-03-19T17:31:23,671][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"  \"metricPath\" : \"Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %\","}
[2020-03-19T17:31:23,673][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"  \"frequency\" : \"TEN_MIN\","}

After many errors I see the below:

[2020-03-19T17:31:26,032][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"    \"useRange\" : true,"}
{
          "path" => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json",
          "host" => "PVDI-FCBT081",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
       "message" => "[ {",
    "@timestamp" => 2020-03-19T12:01:23.634Z,
      "@version" => "1"
}
{
          "path" => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json",
          "host" => "PVDI-FCBT081",
          "tags" => [
        [0] "_jsonparsefailure"
    ],
       "message" => "  \"metricPath\" : \"Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %\",",
    "@timestamp" => 2020-03-19T12:01:23.672Z,
      "@version" => "1"
}

It is writing each line into the message field, but I need each field in the data to be a separate field in Kibana for analysis. The input JSON file is the one I posted at the top of the thread.

Hi, @Noushad_Aslam

I wonder if attempting something like this would work.

The "multiline" plugin may also be able to solve your problem:
https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html

Best of luck!

@jbpratt I tried the gsub, but each line is still written into the message field.
I also removed the json codec and used a json filter instead; same problem.
Can you please provide a correct Logstash configuration file for the JSON file mentioned above?
The field "metricValues" is an array of nested fields like "startTimeInMillis", "occurrences", etc.

Thanks in advance.

If you want to consume the entire file as a single event you must use a multiline codec. See this post for an example.
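A sketch of that approach, based on the file path and structure posted above. The pattern is an arbitrary string chosen so it never matches, so lines keep accumulating into one event until auto_flush_interval flushes them; the target name theData is just an illustrative choice:

```
input {
	file {
		path => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json"
		start_position => "beginning"
		sincedb_path => "nul"
		# Accumulate every line into a single event: the pattern never
		# matches, and auto_flush_interval emits the whole file as one event.
		codec => multiline {
			pattern => "^NEVER_MATCHES"
			negate => true
			what => "previous"
			auto_flush_interval => 2
			max_lines => 20000 # default is 500; raise it for long files
		}
	}
}

filter {
	# Parse the whole file; the top-level JSON array lands under [theData]
	json {
		source => "message"
		target => "theData"
	}
	# One event per metric object in the array
	split {
		field => "theData"
	}
	# One event per entry of the nested metricValues array
	split {
		field => "[theData][metricValues]"
	}
}

output {
	stdout { codec => rubydebug }
}
```

Each resulting event should then carry metricId, metricName, frequency, and a single metricValues entry as separate fields that Kibana can aggregate on.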