Extra chars in Kibana message field

Hello,

Do you know how to deal with these extra characters, which prevent log entries from being parsed properly?

The Elasticsearch message field contains extra characters like [0;39m and e[33m (ANSI color escape sequences):

e[2m2019-09-11 20:45:43.600e[0;39m e[33m WARNe[0;39m e[35m27e[0;39m e[2m---e[0;39m e[2m[nio-8070-exec-9]e[0;39m e[36me.i.k.k.v.validator.Validator    e[0;39m e[2m:e[0;39m #/verifiableCredential/5/credentialSubject: required key [id] not found

The Kubernetes stdout log output, by contrast, looks pretty clean:

2019-09-11 00:22:53.938  WARN 27 --- [nio-8060-exec-9] o.z.p.spring.web.advice.App      : Not Found: Not such consent by vmk dataset id: 123

e[2m2019-09-11 20:45:43.600e[0;39m e[33m WARNe[0;39m e[35m27e[0;39m e[2m---e[0;39m e[2m[nio-8070-exec-9]e[0;39m e[36me.i.k.k.v.validator.Validator e[0;39m e[2m:e[0;39m #/verifiableCredential/5/credentialSubject: required key [id] not found

Does this log line appear after the Logstash parser? If so, can you share the pipeline, including its input, filter, and output sections?

Solved it with \x1B (ESC).
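For anyone hitting the same issue, here is a minimal sketch of one way this can be done with a Logstash mutate/gsub filter, assuming the escape sequences arrive as the literal \x1B (ESC) character in the message field; the field name and pattern are illustrative, not taken from the original poster's pipeline:

filter {
  mutate {
    # Strip ANSI color escape sequences (ESC [ ... m) from the message field
    gsub => [ "message", "\x1B\[[0-9;]*m", "" ]
  }
}

With the color codes removed, the remaining line matches the plain log format shown above and can be parsed normally.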
