Filebeat on Windows generating data much different from what it is ingesting

I configured Filebeat to run on one of our Windows Server 2012 servers. It appears to be ingesting the log file I pointed it at in filebeat.yml, and it parses the file line by line as expected (based on the screen output when running Filebeat as a process on the server). It sends data to Logstash, which in turn sends it to Elasticsearch. However, when I look at the data in Kibana, it is pretty much gibberish.

Here is the pertinent filebeat.yml section -

- type: log
  enabled: true
  paths:
    - e:\Monitor\Filebeat\*
  exclude_lines: ['^Folder:']
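For context, here is a fuller sketch of the relevant filebeat.yml sections, including the Logstash output; the hostname is an assumption, and the port matches the Logstash listener shown later in this post:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - e:\Monitor\Filebeat\*
  exclude_lines: ['^Folder:']

output.logstash:
  # hostname is an assumption; port matches the Logstash input
  hosts: ["logstash.example.local:44444"]
```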

Here is an excerpt from the log file:

Folder: \Applications\Web\Internet
Ping to Google-No DNS Host is alive 2 ms ping (timeout - 2000 ms)
Web - bing.com Host is alive 81 ms URL request
Web - Google.com Host is alive 82 ms URL request
Web - www.yahoo.com Host is alive 328 ms URL request

Here is a small snippet (one log entry) of what is showing on the screen when running filebeat as a process:

"message": "Ping to Google-No DNS\tHost is alive\t2 ms\tping (timeout - 2000 ms)\t",
"input": {
     "type": "log"
}

This is an actual entry from the log, so all appears well so far running filebeat.

However, here is what kibana is showing when viewing data from this index:

2W\u0000\u0000\u0000\e2C\u0000\u0000\u0006\x80x^\xE4[\xD9o۶\u001F\xA7\xF3;vo=v\xEFE\xE0S\aX\u0002E\xDD~j\u05EEˆ\xE6@\x9Cn\xC0 \xA0\xA0c&\u0011fI\x9E\xC4$\v\x8A\xFC\xAD{\u001FvbO{\e(J\xB2#\u001Fe#\xBAs\xBA\xB7\xC0\xB2%\xEA\xFB9\xBE\a\u0019\xFC\u0015\u0000\xA0\u0003@\xE7ǧ\xF0.\x8Bb\x9A3\u0012\x8Fa\u000Fb\x84\x91\x8E\\\u001D\xD9\xFB\b\xF5\x90\xDD3M\xC3u\x9Doa\u0017ލ)#C\xC2\b\xEC=\x85\u0003J\u0018\xEC\xC1\xC3hD\a\x940؅\xEC|La\u000F>\u0019\xA6\a\xB0\vOi\x96Gi\u0002{\xD03<\u0003\xC1\x8B.<Ns\xC6\u007F\x9A\x90\x98\u007Fqkg\xFB\xCB\xFD\x9D=\x93_\u001A\xA5G\xFCJzx\x98S\u0006{\xA8[ܘ\u007F4&\xEC\u0018\xF6 \xED\x85\xE1f\x9A\xB3\xAD4\x89X\x9A\x85\xE1\xC3\xF2\xC1\x93\xBF\xF4\xCD-\x83\xFD\xC0\xE0\xC5E\u0017\xC64\xCF\xC9\u0011\u007F\xCA>͙Ɵ\u0018\xB2>#\xEC$\u000F\xD9\u001E\u001D\x8F\xCECV\\\x89);N\x87!\x83]\u0018%\xE3\u0013\xC6\u001FY\xBE\b_\xD3E\u0017\x92#\x9A\u0014\u001F\xF3\xE5\xF3\eM/\xBD\v\xA3!\xEC\xC1\u0000\x99ȷ=W\xF7M/\xD0\xED\xA1\x8FtB\xA9\xA7\xDB\u000E!\x84\x9A\xB6c\u0012\xFE\xDDfL\xBAգ\xF8˖Q\xA4\xE3c\u001Aӌ\x8C\x9E\u0014w\xC6\xFE\xC0\xB7\u001D\xFF@\xB7\xADC\xA4ۇ\xC8\xD5\xC9!º3p\xDCA\xE0\x91Ap\xE8\xF2U҃\x9C/}\xF2\u0004\xD3px\xD4/

Here is a snippet of the output from Wireshark running on the server hosting the log file (the same system where Filebeat is running), so it is effectively capturing the data as Filebeat sends it to the Logstash server for processing. It looks like what Kibana is showing: no discernible text from the log entries:
(standard packet window in wireshark - hex values on the left, ascii on the right)

f730   6c 77 63 5d 8c 6c 7b 2c 05 28 21 fe 67 d2 1e c7   lwc].l{,.(!.g...
f740   61 9e bc 7e f0 e5 f3 05 3f 06 f3 3a 5f 30 93 90   a..~....?..:_0..
f750   35 4d 2e 9a 8c 20 bb 74 6e a3 79 ea 21 0b 8b 3b   5M... .tn.y.!..;
f760   93 91 3d b3 7c 14 87 2c ac 5f be e0 4c 7e d1 19   ..=.|..,._..L~..
f770   62 08 ca ec 52 ef 3b aa b1 38 14 9a a6 ba 85 70   b...R.;..8.....p
f780   b7 ec fb 01 c8 63 dd 9a 5f d4 e7 8d da bc 5e 9d   .....c.._.....^.
f790   df 65 cd 93 13 d0 98 1f 18 cd 83 98 56 b8 b9 2b   .e..........V..+
f7a0   25 9d 97 61 28 27 cb fe 20 6d 41 e5 b3 fc 92 83   %..a('.. mA.....
f7b0   71 9d c9 8c 55 be 99 16 ee 30 84 cf a7 be 84 89   q...U....0......
f7c0   09 84 fb 31 04 bb 7b 25 a8 cd e4 a8 af 64 b4 e7   ...1..{%.....d..
f7d0   87 bc e4 0c 20 4e 6a 52 ca d8 40 00 a2 a1 20 d3   .... NjR..@... .
f7e0   aa 99 3c 0e 39 eb 0b 95 ba e0 77 b1 5a 35 37 5b   ..<.9.....w.Z57[
f7f0   5d b5 7b bc 15 ac 6d aa ad ae 7a f0 10 da 00 8c   ].{...m...z.....
f800   a4 2c be e3 37 45 5e 40 90 37 14 a5 f4 7a aa 54   .,..7E^@.7...z.T
f810   99 29 9a 11 aa c9 68 4d 8e 99 fe 68 c4 cc 61 9f   .)....hM...h..a.
f820   7c 01 92 75 97 e3 68 96 a1 0c cd ca b4 55 6c da   |..u..h......Ul.
f830   21 5e 84 d9 a6 df f6 b7 fc 8d 13 e8 dc 2b 6f 23   !^...........+o#
f840   b9 fc 9c 82 d1 74 73 05 e8 dd e4 8b 5e ec 7d 63   .....ts.....^.}c
f850   d4 67 1b 53 5f 15 46 eb d1 51 09 4b 5d 3c be ce   .g.S_.F..Q.K]<..
f860   f4 35 46 fe 22 99 91 48 88 66 73 14 fe 7c c1 1b   .5F."..H.fs..|..
f870   bf f9 35 7e e3 8c 24 b0 bf 00 f4 b6 f2 d1 de ec   ..5~..$.........
f880   08 9e b9 0c 08 43 32 56 9c 96 4c 65 81 a3 86 bf   .....C2V..Le....
f890   86 24 91 9e ae 60 89 72 c2 e7 be 81 50 95 d1 50   .$...`.r....P..P
f8a0   d3 a2 df 39 ee 74 d6 5b de 06 54 7f 86 9a c2 29   ...9.t.[..T....)
f8b0   aa 0c e8 bd 73 2e da cb ca c2 54 2b b4 07 b7 47   ....s.....T+...G
f8c0   2f 6b 42 ed 1d 54 b1 84 b6 88 e9 4d 3c e4 0c 50   /kB..T.....M<..P

Does anyone know why Filebeat appears to be turning the log data into something other than text?

Thank you in advance for your assistance.

Solved,

In doing more research into this, it turns out this 'gibberish' is actually the 'Lumberjack' wire protocol that Beats uses, rather than the JSON-formatted output string I was expecting to see. The issue was that I was not properly decoding this stream in the Logstash pipeline input configuration: I had the pipeline configured to ingest the Filebeat data with a plain 'tcp' port listener, when it should have been configured with a 'beats' listener.
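As a sanity check, the first bytes of the Kibana output above ("2W\u0000\u0000\u0000\e" followed by "2C") are consistent with Lumberjack v2 framing: a window-size frame, then a compressed payload frame, which is why neither Wireshark nor a plain tcp input shows readable text. A minimal sketch decoding that frame header (the byte values are taken from the output above):

```python
import struct

# First six bytes of the stream as Kibana rendered it: "2W\x00\x00\x00\x1b".
# In the Lumberjack v2 protocol each frame starts with a one-byte protocol
# version ('2') and a one-byte frame type. 'W' is a window-size frame whose
# payload is a 4-byte big-endian count; the next frame here, '2C', is a
# compressed-payload frame, hence the unreadable bytes on the wire.
raw = b"2W\x00\x00\x00\x1b"

version = chr(raw[0])                         # protocol version: '2'
frame_type = chr(raw[1])                      # frame type: 'W' (window size)
window_size = struct.unpack(">I", raw[2:6])[0]  # big-endian uint32

print(version, frame_type, window_size)  # 2 W 27
```

The `beats` input plugin does this framing and decompression for you, which is exactly what the plain `tcp` input was not doing.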

I went from this:

input {
  tcp {
    port => 44444
    type => alltests
  }
}

To this:

input {
  beats {
    port => 44444
    type => alltests
  }
}

in the pipeline configuration file assigned to ingest this data, and now Kibana is showing what you would expect.

Problem solved...
