Arabic and special characters

Dear Team,

Kindly note that when we send data containing Arabic or special characters to Logstash, it shows up as HTML-encoded text in the Elastic log.
Example:

Ma’aden (original text) ----> Ma’aden (elastic log)

الرياض (original text) ----> الرياض (elastic log)

Kindly support us in having it stored as the original text.

Also, please find my configuration below:

input {
  http {
    host => "x.x.x.x"
    port => 3131
  }
}

filter {
  mutate {
    remove_field => ["body", "Log"]
  }
  xml {
    source            => "message"
    target            => "doc"
    remove_namespaces => true
    force_array       => false
    remove_field      => ["[doc][logMessage]", "[doc][gtxnID][keyword]", "[headers][connection]", "[headers][content_length]", "[headers][content_type]", "[headers][http_host]", "[headers][http_version]", "[headers][request_method]", "[headers][request_path]"]
  }
}

output {
  elasticsearch {
    hosts    => ["x.x.x.x:9200"]
    user     => "elastic"
    password => "xxxx"
    index    => "datapower"
  }
  stdout {
    codec => plain
  }
}
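If the HTML encoding is introduced upstream (before the event reaches Logstash), one possible workaround is to decode it inside the pipeline. Below is a minimal sketch, assuming the stored values are standard HTML character references; it uses Ruby's `CGI.unescapeHTML`, which a Logstash `ruby` filter can call. The sample encoded strings are hypothetical, chosen to mirror the examples above:

```ruby
require "cgi"

# Hypothetical HTML-encoded values, as they might appear in the elastic log
encoded_latin  = "Ma&#x2019;aden"
encoded_arabic = "&#1575;&#1604;&#1585;&#1610;&#1575;&#1590;"

# CGI.unescapeHTML decodes decimal and hexadecimal numeric character references
puts CGI.unescapeHTML(encoded_latin)   # => Ma’aden
puts CGI.unescapeHTML(encoded_arabic)  # => الرياض
```

In the pipeline this could be wired up as a `ruby` filter applied to whichever field carries the encoded text, for example `ruby { code => 'require "cgi"; event.set("message", CGI.unescapeHTML(event.get("message")))' }` — but whether that is the right place depends on where in your chain the encoding actually happens.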

Best regards,


Dear @carly.richmond,

Your support, please.

Hi @abdullah144,

On the Elasticsearch side, can you check which analyzer is being used by your index? Is it the Arabic analyzer?
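You can check this from Kibana Dev Tools. Assuming the index is named `datapower`, as in the config above, the index settings show any custom analyzers and the mapping shows which analyzer each text field uses:

```
GET /datapower/_settings
GET /datapower/_mapping
```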

Dear @carly.richmond ,

Do you mean the below?

Dear @Badger,

Your usual support, please.

Thanks,