Hi,
I'm running Elasticsearch in Docker and haven't created any index mappings there, so the indexes rely on dynamic mapping. I'm getting the following exception:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 43 in stage 516846.0 failed 4 times, most recent failure: Lost task 43.3 in stage 516846.0 (TID 5945113, 10.244.18.196, executor 4): org.apache.spark.util.TaskCompletionListenerException: Could not write all entries for bulk operation [1/2]. Error sample (first [5] error messages):
org.elasticsearch.hadoop.rest.EsHadoopRemoteException: mapper_parsing_exception: failed to parse;org.elasticsearch.hadoop.rest.EsHadoopRemoteException: illegal_argument_exception: object field starting or ending with a [.] makes object resolution ambiguous: [../../../../../../../../../../../../../../../etc/passwd]
{"index":{"_id":"435154|3189f8ce-fad5-44c6-ad7c-ec6c66679aa3|WEB|/apply/application"}}
{"service_id":"3189f8ce-fad5-44c6-ad7c-ec6c66679aa3","src_group":"WEB","src_id":"/apply/application","txn_count":6,"destination_map":{"/var/log":1,"cp /etc/passwd /tmp/":1,"http://localhost:8281/http/get":2,"../../../../../../../../../../../../../../../etc/passwd":1,"<?xml version=\"1.0\" encoding=\"ISO-8859-1\"?><!DOCTYPE foo [ ]>&xxe;":1},"ss_map":{"SSRF":2,"DIRECTORY":1,"READFILE":2,"COMMAND":1},"app_id":"d2f5d24d-89fc-49b9-8237-83e844a147fe","interval":18131,"interval_type":"DAILY","doc_id":"435154|3189f8ce-fad5-44c6-ad7c-ec6c66679aa3|WEB|/apply/application","hour_interval":0}
From the error sample, the problem seems to be that the keys of destination_map are raw strings being used as field names, and one of them starts with a dot, which Elasticsearch's dynamic object mapping rejects as ambiguous. What could be the best solution?
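One workaround I'm considering is sanitizing the map keys in Spark before writing. Here is a minimal sketch, assuming Spark 3.x DataFrames and the elasticsearch-spark connector; the source path, the SanitizeDestinationMap object, and the index name are placeholders, not my actual job:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, regexp_replace, transform_keys}
import org.elasticsearch.spark.sql._

object SanitizeDestinationMap {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sanitize-destination-map").getOrCreate()

    // Placeholder source; the real job builds this aggregate elsewhere.
    val df = spark.read.parquet("/data/txn_aggregates")

    // Replace every '.' in the map keys so dynamic mapping can no longer
    // interpret a key such as one starting with "../" as an object path.
    val sanitized = df.withColumn(
      "destination_map",
      transform_keys(col("destination_map"), (k, v) => regexp_replace(k, "\\.", "_"))
    )

    // Placeholder index name.
    sanitized.saveToEs("txn-aggregates")
  }
}
```

Alternatively, if my Elasticsearch version supports it (7.3+), I could map destination_map as a flattened field, or restructure it as an array of key/value objects so that user-controlled strings never become field names. Which of these would you recommend?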