Sorry if these questions have been answered before; I am new to Elastic and cannot find the answers.
I have an entry in my Docker logs that looks like this:
Wed, 28 Aug 2019 15:40:23 GMT - info: Prematch events sync process started
When Filebeat ships it to Elasticsearch, it ends up like this:
Wed, 28 Aug 2019 15:40:23 GMT - e[32minfoe[39m: Prematch events sync process started
Firstly, can I get rid of the "e[32m" and "e[39m" sequences?
Secondly, can I get rid of the timestamp, as there is already a field for this?
Thirdly, I am seeing these errors. How can I stop them?
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'a' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'a' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'W' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: json: cannot unmarshal number into Go value of type map[stri
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'c' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: json: cannot unmarshal number into Go value of type map[stri
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
Those look like ANSI escape sequences that are used to define colors; is it possible that these logs are colored?
You can probably configure your application not to log colored output; colors are usually not useful in log files.
If not: Filebeat doesn't have any feature to remove parts of log lines, so you may need to use an ingest pipeline with the gsub processor.
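As a rough sketch, an ingest pipeline using the gsub processor could look like the one below. The pattern assumes the color codes arrive with the raw ESC character (`\u001b`); if they have already been mangled into literal `e[32m`-style text, as your sample suggests, adjust the pattern accordingly (e.g. `e\\[[0-9;]*m`):

```json
PUT _ingest/pipeline/strip-ansi
{
  "description": "Remove ANSI color escape sequences from the message field",
  "processors": [
    {
      "gsub": {
        "field": "message",
        "pattern": "\\u001b\\[[0-9;]*m",
        "replacement": ""
      }
    }
  ]
}
```

You would then point Filebeat's Elasticsearch output at this pipeline (the `pipeline` option of the output) so every event passes through it before being indexed.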
You can define an ingest pipeline to parse your logs; there you could separate the date from the rest of the log message, and use the date in the logs as the timestamp.
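For example, assuming the message format shown above ("Wed, 28 Aug 2019 15:40:23 GMT - info: ..."), a pipeline could split off the leading date with the dissect processor and feed it to the date processor, which sets `@timestamp` by default. The field name `event_date` here is just illustrative:

```json
PUT _ingest/pipeline/parse-timestamp
{
  "description": "Use the date at the start of the message as the event timestamp",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{event_date} - %{message}"
      }
    },
    {
      "date": {
        "field": "event_date",
        "formats": ["EEE, dd MMM yyyy HH:mm:ss zzz"]
      }
    }
  ]
}
```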
Before starting to implement your own pipeline, take a look at the existing Filebeat modules to see if there is already a module for the service generating these logs. Filebeat modules include predefined pipelines for well-known services.
It looks like Filebeat is trying to parse as JSON something that is not a JSON document. Your configuration would help to diagnose this; I see you have pasted it, but it seems incorrectly formatted. Could you paste it again as preformatted text? There is a button in the toolbar with an icon like </> that can help with that.
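A common cause of these errors is having JSON decoding enabled on an input whose containers actually log plain text. A hedged sketch of what to check in your config (option names shown are the standard `json.*` input settings; remove them if your logs are not JSON):

```yaml
filebeat.inputs:
  - type: docker
    containers.ids: ["*"]
    # If the container logs plain text, json.* options like these
    # will produce "Error decoding JSON" messages and should be removed:
    # json.message_key: message
    # json.keys_under_root: true
```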