Thanks for the help. It is writing the errors below after Logstash is started:
[2020-03-19T17:31:23,482][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-19T17:31:23,618][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input: expected close marker for Object (start marker at [Source: (String)"[ {"; line: 1, column: 3])
at [Source: (String)"[ {"; line: 1, column: 7]>, :data=>"[ {"}
[2020-03-19T17:31:23,666][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"metricId\" : 1776736,"}
[2020-03-19T17:31:23,669][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"metricName\" : \"Hardware Resources|Memory|Used %\","}
[2020-03-19T17:31:23,671][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"metricPath\" : \"Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %\","}
[2020-03-19T17:31:23,673][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"frequency\" : \"TEN_MIN\","}
After many such errors I see the below:
[2020-03-19T17:31:26,032][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>" \"useRange\" : true,"}
{
    "path" => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json",
    "host" => "PVDI-FCBT081",
    "tags" => [
        [0] "_jsonparsefailure"
    ],
    "message" => "[ {",
    "@timestamp" => 2020-03-19T12:01:23.634Z,
    "@version" => "1"
}
{
    "path" => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json",
    "host" => "PVDI-FCBT081",
    "tags" => [
        [0] "_jsonparsefailure"
    ],
    "message" => " \"metricPath\" : \"Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %\",",
    "@timestamp" => 2020-03-19T12:01:23.672Z,
    "@version" => "1"
}
It is writing each line into the message field, but I need each field in the data as a separate field in Kibana for analysis.
I have mentioned the input JSON file at the top; it looks like the below (a config sketch I am thinking of trying follows the sample).
[ {
  "metricId" : 1776736,
  "metricName" : "Hardware Resources|Memory|Used %",
  "metricPath" : "Application Infrastructure Performance|Root|Individual Nodes|FBLQAMT|Hardware Resources|Memory|Used %",
  "frequency" : "TEN_MIN",
  "metricValues" : [ {
    "startTimeInMillis" : 1582555800000,
    "occurrences" : 1,
    "current" : 25,
    "min" : 25,
    "max" : 25,
    "useRange" : true,
    "count" : 20,
    "sum" : 500,
    "value" : 25,
    "standardDeviation" : 0
  }, {
    "startTimeInMillis" : 1582556400000,
    ....
  } ]
}, {
  "metricId" : 1776736,
  "metricName" : "Hardware Resources|Memory|Used %",
  "metricPath" : "Application Infrastructure Performance|Root|Individual Nodes|FBLTESTUXP|Hardware Resources|Memory|Used %",
  "frequency" : "TEN_MIN",
  "metricValues" : [ {
    "startTimeInMillis" : 1582555800000,
    .....
  } ]
} ]
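From what I have read, one way is to read the whole pretty-printed file as a single event with a multiline codec, parse it with the json filter, and then split the top-level array and the nested metricValues array so every metric value becomes its own document. Below is a minimal sketch of what I am thinking of trying; the never-matching pattern, the 2-second auto_flush_interval and the NUL sincedb path are my own assumptions, while the field names come from the sample above. Please correct me if this is not the right approach.

input {
  file {
    path => "H:/ELK/AppD_Data/FBLQAFeb24_50uMem.json"
    start_position => "beginning"
    sincedb_path => "NUL"            # assumption: Windows; use /dev/null on Linux
    # Assumption: accumulate the whole file into one event. The pattern never
    # matches, so every line is appended to the previous one until the
    # auto_flush_interval flushes the accumulated event.
    codec => multiline {
      pattern => "^WILL_NEVER_MATCH"
      negate => true
      what => "previous"
      auto_flush_interval => 2
      max_lines => 100000
    }
  }
}

filter {
  # Parse the whole JSON document into a temporary field
  # (the top level is an array, so it needs a target).
  json {
    source => "message"
    target => "metric"
    remove_field => [ "message" ]
  }
  # One event per element of the top-level array ...
  split { field => "metric" }
  # ... and one event per entry of its metricValues array.
  split { field => "[metric][metricValues]" }
  # Use the metric's own timestamp instead of the file read time.
  date {
    match => [ "[metric][metricValues][startTimeInMillis]", "UNIX_MS" ]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }
}

If I understand the split filter correctly, each resulting document should then carry fields such as [metric][metricName] and [metric][metricValues][value] in Kibana.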