Aggregate JSON input

Hello everyone,
I am trying to aggregate JSON input that I harvest through Filebeat.
This is my application log file:

}{
  "timestamp" : "2022-05-16 15:47:40",
  "level" : "ERROR",
  "thread" : "main",
  "logger" : "com.crossix.safemine.cloud.SMCFlow",
  "message" : "[reduce-jane-phi-no-stats_192_52_1652714359200]Shachar tests exception1111 - Fail to execute SMCFlow",
  "context" : "default",
  "exception" : "java.lang.ArithmeticException: / by zero\n\tat com.crossix.safemine.cloud.SMCFlow.process(SMCFlow.java:102)\n\tat com.crossix.safemine.cloud.SMCFlow.execute(SMCFlow.java:77)\n\tat com.crossix.safemine.cloud.SMCApplication.run(SMCApplication.java:114)\n\tat com.crossix.safemine.cloud.SMCApplication.main(SMCApplication.java:32)\n"
}{
  "timestamp" : "2022-05-16 15:47:40",
  "level" : "ERROR",
  "thread" : "main",
  "logger" : "com.crossix.safemine.cloud.SMCFlow",
  "message" : "[reduce-jane-phi-no-stats_192_52_1652714359200] Fail SMCFlow on(1) ",
  "context" : "default"
}

And this is my Filebeat configuration:

processors:
  - drop_fields:
      fields: ["input.type","ecs.version","agent.version","agent.type","tags","agent.name","log.offset","_score","_type","_id","_index","agent.id","agent.hostname","agent.ephemeral_id","@version"]
  - add_fields:
      target: Environment
      fields:
        name: 'Spark-Cluster-bravo'

filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - '/mnt/nfs/var/nfsshare/logs/Spark-Cluster-bravo/reduce-component.log'
  close_inactive: 10m

  json.message_key: message
  json.keys_under_root: true
  json.add_error_key: true
  fields_under_root: true
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644

output.logstash:
  hosts: ["10.192.XX.XXX:5044"]
  index: "filebeat-spark"

Right now I get each line as a separate message, but what I expect is the whole JSON element in one message instead.

Like this:

{
  "timestamp" : "2022-05-16 15:47:40",
  "level" : "ERROR",
  "thread" : "main",
  "logger" : "com.crossix.safemine.cloud.SMCFlow",
  "message" : "[reduce-jane-phi-no-stats_192_52_1652714359200] Fail SMCFlow on(1) ",
  "context" : "default"
}

The Filebeat log doesn't show any errors, so I am wondering whether any of the configuration I added is even taking effect.
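
One thing I plan to try is temporarily raising the log level to see whether the multiline/json settings are even being picked up (the selectors value is just my guess at a useful scope):

logging.level: debug
logging.selectors: ["*"]

and then look for multiline- or json-related messages in /var/log/filebeat.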

And this is how I see the results in OpenSearch:

Thanks in advance

OpenSearch/OpenDistro are AWS-run products and differ from the original Elasticsearch and Kibana products that Elastic builds and maintains. You may need to contact them directly for further assistance.

