After synchronizing data from MySQL to Elasticsearch, the Chinese text is garbled. I configured UTF-8, GBK, etc. in the Logstash configuration file, but it didn't help. Does anyone know how to solve this?
Note: the database encoding is UTF-8.
What did your pipeline configuration look like? Logstash favors UTF-8 wherever possible, and most codecs and inputs allow explicit configuration of charsets (e.g., the per-column charset configuration in the JDBC Input plugin).
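As a minimal sketch of what that per-column configuration looks like, here is a hypothetical JDBC input pipeline. The table name `articles`, the column name `title`, and the connection details are assumptions for illustration; `charset` and `columns_charset` are the plugin options in question:

```
input {
  jdbc {
    # Hypothetical connection details; note the explicit encoding
    # parameters on the MySQL JDBC URL.
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb?useUnicode=true&characterEncoding=UTF-8"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT id, title FROM articles"

    # Default charset for all columns returned by the query.
    charset => "UTF-8"

    # Override the charset for individual columns that are stored
    # differently from the rest of the table (assumed GBK here).
    columns_charset => { "title" => "GBK" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "articles"
  }
}
```

The key point is that a blanket `charset` setting cannot fix a single column that was stored in a different encoding; `columns_charset` lets you name that column explicitly.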
When using a database like MySQL, there are typically several places where things can get screwy: the server, the database, the table, each individual column, and the client connection can all carry their own character set, and any one of them being wrong can garble the data.
Can you share the schema?
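To check the points above yourself, MySQL can report the character set at each level. These are standard MySQL statements; `articles` is an assumed table name:

```sql
-- Server, database, and connection character sets
SHOW VARIABLES LIKE 'character_set%';

-- Table default character set (shown in the CREATE statement)
SHOW CREATE TABLE articles;

-- Per-column character set (the Collation column implies it)
SHOW FULL COLUMNS FROM articles;
```

If a column's collation is, say, `latin1_swedish_ci` while everything else reports `utf8mb4`, that column is the likely source of the mojibake.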
Thank you for the reminder; the problem is solved. I had overlooked the encoding of the individual column. After reading your reply I thought it over again, remembered the items you listed, and the problem was fixed. Thank you very much.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.