Logs not showing up in readable format in Kibana after Logstash ingestion

Hi There,

I have a problem with CloudWatch logs not showing up in a readable format in Kibana, and I believe the problem is with the Logstash charset setting. The error is:

[main] Received an event that has a different character encoding than you configured. {:text=>"\u0017\u0016\xBE\xAE\xED\"\xBE\xA5\xF0\u0004`B>\xDC١\xAFu\u000FV\xFC\xA3lz\u000F\xA6\xBF\xD1q\u001Fހ\u007F\u0006|\xF1\xF0\b\xAA\x95\xAF\a\xFB\n", :expected_charset=>"UTF-8"}

My setup is:
CWL -> Kinesis Firehose -> S3 -> Logstash -> EC2 ES Cluster -> Kibana

If I replace Logstash with Fluentd, the logs show up fine in Kibana so I know the setup is correct.

Any idea which charset I should be using in Logstash for my scenario?

Thanks in advance!
CK

OK, I have figured this out. I configured an S3 event notification to go to an AWS SQS queue, then installed the logstash-input-s3-sns-sqs plugin and used a Logstash pipeline like this:

input {
  s3snssqs {
    region => "ap-southeast-1"
    queue => "my-sqs-queue"
    type => "sqs_logs"
    codec => json { charset => "UTF-8" }
    sqs_skip_delete => true
    from_sns => false
    s3_options_by_bucket => [
      {
        bucket_name => "my-s3-bucket"
        prefix => "my-folder-in-s3/"
      }
    ]
  }
}
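For reference, the plugin install and the S3-to-SQS event notification step can be sketched with the AWS CLI roughly like this. The account ID, queue policy file, bucket, queue, prefix, and region are placeholders matching my config above; adjust them to your own setup, and make sure the SQS queue policy allows S3 to send messages before wiring up the notification:

```shell
# Install the plugin on the Logstash host.
bin/logstash-plugin install logstash-input-s3-sns-sqs

# 1. Attach a queue policy that lets s3.amazonaws.com call SendMessage
#    (queue-policy.json is a placeholder file you write yourself).
aws sqs set-queue-attributes \
  --queue-url https://sqs.ap-southeast-1.amazonaws.com/123456789012/my-sqs-queue \
  --attributes file://queue-policy.json

# 2. Send ObjectCreated events for the prefix to the queue.
aws s3api put-bucket-notification-configuration \
  --bucket my-s3-bucket \
  --notification-configuration '{
    "QueueConfigurations": [{
      "QueueArn": "arn:aws:sqs:ap-southeast-1:123456789012:my-sqs-queue",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {"Key": {"FilterRules": [
        {"Name": "prefix", "Value": "my-folder-in-s3/"}
      ]}}
    }]
  }'
```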

Hope this helps anyone having the same issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.