Data sent by Filebeat looks garbled

I recently set up Filebeat on a Windows 2008 (R1) machine to send my Exchange Server Message Tracking Logs to ELK. These are CSV files, which are split by Logstash. However, the data only shows up as garbled lines, like this:

  "_index": "logstash_exch-2015.12.20",
  "_type": "ex_msg_trk",
  "_id": "AVG_8dcYVabyR4iBti7U",
  "_score": null,
  "_source": {
"message": "\\xFD%\\x81\\u007F\\u0005\\u0000\\u0000\\xFF\\xFFò\\xF0r",
"@version": "1",
"@timestamp": "2015-12-20T15:11:15.440Z",
"host": "192.168.79.46",
"port": 24020,
"type": "ex_msg_trk"

The message looks very garbled, and this is only a short one; most of the time it's about 10 lines of this stuff.

I already tried changing the encoding type, but with no success. However, if I change the output in Filebeat to file instead of logstash, everything looks fine.
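
For reference, this is roughly how I'm switching between the two outputs in filebeat.yml (Filebeat 1.x layout; the host, port and path are just examples):

output:
  # normal output to Logstash
  logstash:
    hosts: ["logstash-host:10516"]

  # for testing: write the events to a local file instead
  # file:
  #   path: "C:/temp/filebeat"
  #   filename: filebeat.out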

Does this sound familiar to anyone?

That looks like the wrong encoding. You can change it with the encoding option. For Windows, you most likely need utf-16be.
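
Something like this in the prospector section (roughly the Filebeat 1.x layout; the path here is just an example, not your actual message tracking log location):

filebeat:
  prospectors:
    -
      paths:
        - C:\ExchangeLogs\MessageTracking\*.LOG
      input_type: log
      # encoding of the source files, e.g. plain, utf-8, utf-16be, utf-16le
      encoding: utf-16be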

Hmmm, I already thought it had to be something related to encoding. I assumed the encoding was "plain" since it was just a text file, but I'm probably wrong.

I tried utf-16be, but no luck. I also found out I have to change the input encoding in Logstash, which I hadn't done before. This was appearing in the logstash log:

message=>"Received an event that has a different character encoding than you configured."

I got some better results, but still not readable logs. But at least I know in which direction I have to look. Thanks!

Edit: I'm also a little confused. If I open one of the log files in Windows (Notepad) and go to save it again, it says "Encoding: UTF-8". But if I configure that in Filebeat and Logstash, I still get garbled output... :confused:

Edit 2: I think I have to configure the encoding in Filebeat OR in Logstash, not in both. But I'm not sure.

On Windows it should be utf-16le, if UTF-16 is used at all. After switching the encoding between UTF-8 and UTF-16 you might have to delete the registry file, in case the stored file offsets are wrong.

Is your file mostly ASCII content (no international characters)? Can you check with a hex dump whether the first bytes in the file are really text content?
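
For example, something like this from a PowerShell prompt shows the first bytes (the path is just an example). A UTF-8 BOM is EF BB BF, UTF-16LE starts with FF FE, UTF-16BE with FE FF; plain ASCII text starts directly with printable characters:

# dump the first 16 bytes of the file as hex
Get-Content -Encoding Byte -TotalCount 16 "C:\ExchangeLogs\MSGTRK20151213-1.LOG" |
  ForEach-Object { "{0:X2}" -f $_ }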

I tried the utf-16le encoding, but I still don't get anything usable. I also deleted the registry file after changing the encoding. What I noticed is that the registry file is not created when Filebeat is started as a service, but it is created when starting it from a PowerShell prompt.

This is what a log line looks like on the Exchange Server:

2015-12-13T00:58:11.100Z,209.85.213.178,mail-ig0-f178.isp.com,192.168.79.46,VDMMAIL04,08D303422AFE7F83;2015-12-13T00:58:03.317Z;0,VDMMAIL04\Default internal receive connector VDMMAIL04,SMTP,RECEIVE,103548,<CAHVruR2wfV632bT1t7rCe3JR5O9Mva29oj7PJz7_P9Qymqg=yw@mail.isp.com>,rene@domain.net,,4990,1,,,Re: Testje 23:52,rene@isp.com,rene@isp.com,00A: NTS: ,Incoming,,209.85.213.178,192.168.79.46,

(I changed the e-mail addresses so I won't be bombed by spam).

This is what my input in logstash looks like:

input {
  tcp {
    type => "ex_msg_trk"
    port => 10516
  }
}

I also tried changing the tcp input to the beats input, but then I didn't receive anything anymore.

This is really driving me crazy. I'm almost considering switching to nxlog, but I'd really like to use the Elastic products. This should work, right?

I sent you a PM.

The 'tcp' input plugin won't work here; the beats plugin is the right one. Can you share your config and the versions you've installed? What's the beats plugin version?
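
A minimal beats input looks something like this (5044 is just an example port; it has to match the port in the Filebeat logstash output, and you may need to install the logstash-input-beats plugin first):

input {
  beats {
    type => "ex_msg_trk"
    port => 5044
  }
}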

Guys, did you solve this issue? I have the same problem and am still trying to figure out how to solve it.

@Vladimir_Dimov can you share some details about your logstash and filebeat config?

Yes, this was solved by using the beats input instead of the tcp or udp input, and by using no filters in Logstash at all.
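
For completeness, the Filebeat side then just points its logstash output at the port of that beats input, roughly like this (host and port are examples):

output:
  logstash:
    hosts: ["logstash-host:5044"]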

Hope this helps.

Yes, I've solved the issue the same way, by changing the input plugin. Thanks, guys!

Hope this topic will help other filebeat newbies :slight_smile: