Logstash throwing error: Unexpected character ('<' (code 60))

Hi team,
I am using ELK on the cloud, with Filebeat sending data to Logstash; Logstash sends to Elasticsearch, which makes it visible in Kibana. Most of the data is parsed and sent to Kibana successfully, but there are now instances where a few documents are missing in Kibana. I am getting a JSON parse error even though the JSON body looks good and validates.

I see the following Logstash error:

Error parsing json {:source=>"message", :raw=>"{"timestamp":"2019-08-13T07:10:10.713Z", "type":"CacheRedis", "name":"insertWatsonRecord", "body":{"operation":"insertWatsonRecord", "table":"watsonIntentAndEntity", "query":{"intents":<Java::JavaUtil::ArrayList:-302032833 [{"intent":"greet", "confidence":1}]>, "entities":<Java::JavaUtil::ArrayList:1 >, "input":{"text":"hi"}, "output":{"nodes_visited":<Java::JavaUtil::ArrayList:1 >, "warning":"No dialog node condition matched to true in the last dialog round - context.nodes_visited is empty. Falling back to the root node in the next round.", "log_messages":<Java::JavaUtil::ArrayList:1910241773 [{"level":"warn", "msg":"No dialog node condition matched to true in the last dialog round - context.nodes_visited is empty. Falling back to the root node in the next round."}]>, "generic":<Java::JavaUtil::ArrayList:1 >, "text":<Java::JavaUtil::ArrayList:1 >}, "context":{"conversation_id":"97dee768-da32-490a-aa79-0e103df828aa", "system":{"dialog_request_counter":1, "initialized":true, "dialog_stack":<Java::JavaUtil::ArrayList:22214298 [{"dialog_node":"root"}]>, "dialog_turn_counter":1}}}, "startTime":1565680210701}, "duration":12, "success":true, "itemType":"dependency"}", :exception=>#

<LogStash::Json::ParserError: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (byte)"{"timestamp":"2019-08-13T07:10:10.713Z", "type":"CacheRedis", "name":"insertWatsonRecord", "body":{"operation":"insertWatsonRecord", "table":"watsonIntentAndEntity", "query":{"intents":<Java::JavaUtil::ArrayList:-302032833 [{"intent":"greet", "confidence":1}]>, "entities":<Java::JavaUtil::ArrayList:1 >, "input":{"text":"hi"}, "output":{"nodes_visited":<Java::JavaUtil::ArrayList:1 >, "warning":"No dialog node condition matched to true in the last dialog round - context.nodes_visited is empty."[truncated 686 bytes]; line: 1, column: 187]>}
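For what it's worth, the `<Java::JavaUtil::ArrayList:…>` tokens in that raw payload look like JRuby/Java object inspect output rather than JSON, and that `<` is exactly what Jackson stops on. A minimal stand-alone check (Python here only to illustrate; any JSON parser behaves the same):

```python
import json

# A fragment shaped like the one in the Logstash error. The value after
# "entities" is object-inspect output, not a JSON value, so the parser
# rejects the '<' -- the same "Unexpected character ('<')" Logstash reports.
fragment = '{"entities":<Java::JavaUtil::ArrayList:1 >}'
try:
    json.loads(fragment)
except json.JSONDecodeError as err:
    print("parse failed:", err)
```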

Filebeat log:
I verified the JSON body present in the Filebeat log, and it seems to be valid.
{
"timestamp":"2019-08-12T12:39:25.255Z",
"type":"CacheRedis",
"name":"insertWatsonRecord",
"body":"{\n "operation": "insertWatsonRecord",\n "table": "watsonIntentAndEntity",\n "query": "{\"intents\":[{\"intent\":\"greet\",\"confidence\":1}],\"entities\":,\"input\":{\"text\":\"hi\"},\"output\":{\"generic\":,\"text\":,\"nodes_visited\":,\"warning\":\"No dialog node condition matched to true in the last dialog round - context.nodes_visited is empty. Falling back to the root node in the next round.\",\"log_messages\":[{\"level\":\"warn\",\"msg\":\"No dialog node condition matched to true in the last dialog round - context.nodes_visited is empty. Falling back to the root node in the next round.\"}]},\"context\":{\"timezone\":\"America/New_York\",\"conversation_id\":\"cd40ae3c-f12d-4c5a-aa71-0bfd2c05e984\",\"system\":{\"initialized\":true,\"dialog_stack\":[{\"dialog_node\":\"root\"}],\"dialog_turn_counter\":1,\"dialog_request_counter\":1}}}",\n "startTime": 1565613565251\n}",
"duration":4,
"success":true,
"itemType":"dependency"
}
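One thing worth double-checking in the event above: inside the nested "query" string, several keys appear to have no value at all (`\"entities\":,`, `\"generic\":,`, …). A quick stand-alone check (Python just for illustration) shows that a bare comma after a colon is not accepted by any JSON parser:

```python
import json

# A reduced version of the "query" string above: "entities" is followed
# by a comma with no value in between, which is not a JSON value.
fragment = '{"intents": [{"intent": "greet", "confidence": 1}], "entities":,}'
try:
    json.loads(fragment)
except json.JSONDecodeError as err:
    print("parse failed:", err)
```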

Here is my Logstash pipeline config (logstash.conf):

logstash.conf: |
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    client_inactivity_timeout => 3000
    port => 5044
    ssl => false
  }
}
      
filter {

  mutate {
    gsub => [
      "message", '=>', ':',
      "message", ':,', ':,'
    ]
  }

  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["*******"]
    index => "max-log-%{+YYYY.MM.dd}"
    user => "****"
    password => "***"
    cacert => [ "***" ]
    ssl => true
  }

  stdout {
    codec => json
  }
}

Here is my Filebeat config:

data:
  filebeat.yml: |-
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          include_pod_uid: true
          in_cluster: true
          hints.enabled: true
          include_annotations: '*'
          templates:
            - condition.regexp:
                kubernetes.container.name: '{{ condition.kubernetescontainername }}'
              config:
                - type: docker
                  combine_partial: true
                  cri.parse_flags: true
                  cri.force: true
                  containers:
                    path: "{{ .Values.containers.logpath1 }}"
                    ids:
                      - "{data.kubernetes.container.name}"
                    path: "{{ .Values.containers.logpath2 }}"
processors:
  - add_kubernetes_metadata:
      in_cluster: true
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 8

output.logstash:
  hosts: ["{{ $creds.servicename }}.{{ $creds.namespace }}.svc.cluster.local:{{ $creds.port }}"]

Is it something with the mutate filter in logstash.conf? Kindly correct me if anything is wrong here. I assume it is because of the array patterns present in my JSON body. Here is the pattern for which it fails continuously.

{
"timestamp":"2019-08-12T12:39:19.384Z",
"name":"AnalyticsWorkflow",
"body":"{\n "header": {\n "user": "98ab7383281d0f2892c297ef99fc5634",\n "fullname": "b738bb2e7d5b69c73f96c6b9c629b707",\n "fragmentID": 0,\n "channel": "directline",\n "tenant": "ibmqa",\n "startTime": "2019-08-12T12:39:17.142Z",\n "bot": {\n "Name": "IBMCloudQA"\n }\n },\n "workflow": {\n "step": [\n {\n "value": "tips",\n "message": "{\"conversationIntent\":\"greet\",\"subIntent\":\"greetTips\"}",\n "elapsedTime": 1,\n "time": "2019-08-12T12:39:17.143Z"\n }\n ]\n },\n "nlu": {\n "userText": "hi",\n "intent": {\n "name": "greet",\n "confidence": 1\n },\n "entities": {}\n },\n "endEvent": "clearConversation"\n}",
"itemType":"customEvent"
}

That's not valid JSON. The json filter is complaining when it reaches the `<`.
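The paste just above has a second problem besides the `<`: the quotes inside the "body" string are not escaped, so the outer string ends at the first inner quote. A reduced stand-alone check (Python for illustration):

```python
import json

# Reduced version of the paste: the inner quotes around "header" are not
# escaped, so the "body" string ends early and the parser rejects the rest.
raw = '{"name": "AnalyticsWorkflow", "body": "{\\n "header": {}}"}'
try:
    json.loads(raw)
except json.JSONDecodeError as err:
    print("parse failed:", err)
```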

Hi, it is valid; it just lost its formatting when I pasted it.

{
	"timestamp": "2019-08-12T12:39:19.384Z",
	"name": "AnalyticsWorkflow",
	"body": "{\n  \"header\": {\n    \"user\": \"98ab7383281d0f2892c297ef99fc5634\",\n    \"fullname\": \"b738bb2e7d5b69c73f96c6b9c629b707\",\n    \"fragmentID\": 0,\n    \"channel\": \"directline\",\n    \"tenant\": \"ibmqa\",\n    \"startTime\": \"2019-08-12T12:39:17.142Z\",\n    \"bot\": {\n      \"Name\": \"IBMCloudQA\"\n    }\n  },\n  \"workflow\": {\n    \"step\": [\n      {\n        \"value\": \"tips\",\n        \"message\": \"{\\\"conversationIntent\\\":\\\"greet\\\",\\\"subIntent\\\":\\\"greetTips\\\"}\",\n        \"elapsedTime\": 1,\n        \"time\": \"2019-08-12T12:39:17.143Z\"\n      }\n    ]\n  },\n  \"nlu\": {\n    \"userText\": \"hi\",\n    \"intent\": {\n      \"name\": \"greet\",\n      \"confidence\": 1\n    },\n    \"entities\": {}\n  },\n  \"endEvent\": \"clearConversation\"\n}",
	"itemType": "customEvent"
}

My JSON object contains arrays, and Logstash is not able to read/parse them. I do not see the issue for normal bodies, but there should be a way for Logstash to parse the JSON correctly even when it contains nested arrays.

Any suggestion here would be appreciated.

Again, this is not valid JSON. If you edit your post, select the JSON, and click on </> it may then display valid JSON. Or not.

I have verified it on JSONLint; please see, it says valid JSON. I will re-check, but we are parsing the body and validating it before sending it to Filebeat.

OK, that is valid JSON and a json filter will parse it. What problem are you having?
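To illustrate why this re-posted version works where the earlier paste did not (a sketch in Python; the Logstash json filter does the equivalent): the nested "body" value is a string whose inner quotes are escaped, so it can be parsed in two steps, first the outer event and then the JSON embedded in "body".

```python
import json

# Trimmed-down version of the re-posted event: inner quotes in "body"
# are escaped as \" and newlines as \n, so both parse steps succeed.
raw = ('{"name": "AnalyticsWorkflow", '
       '"body": "{\\n  \\"endEvent\\": \\"clearConversation\\"\\n}", '
       '"itemType": "customEvent"}')
event = json.loads(raw)           # step 1: the outer event
body = json.loads(event["body"])  # step 2: the JSON string stored in "body"
print(body["endEvent"])           # clearConversation
```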

The same error I posted earlier: unexpected character. Same as below:
Error parsing json {:source=>"message", :raw=>"{"name":"Redis", "body":{"operation":"insertWatsonRecord", "table":"watsonIntentAndEntity", "query":{"intents":<Java::JavaUtil::ArrayList:-302032833 [{"confidence":1, "intent":"greet"}]>, "entities":<Java::JavaUtil::ArrayList:1

Can you please help me with these settings, if anything is incorrect here? I am pushing data from the Kubernetes pod logs to Filebeat, then to Logstash and onward.
processors:
  - add_kubernetes_metadata:
      in_cluster: true
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 8

And once again - that is not valid JSON and you should not expect a json filter to parse it.

Hey, I was away from work for some time. As I mentioned earlier, I tested the JSON body and it looks good to me.
One question: does Kubernetes add any metadata to Filebeat or Logstash while processing the JSON body?
The application is creating proper JSON (I verified it multiple times). This JSON goes to the pod console and gets dumped into the log files generated by the Kubernetes container. These log files are the input to Filebeat and Logstash.
Do you think the Kubernetes container adds any metadata to the JSON body while building the log files?
PS: if I run this setup locally (excluding Kubernetes), it works absolutely fine and I do not see any JSON parse errors.
Any suggestions would be great here.