Logs are not getting parsed?

Hi,

I have a log file containing entries in this format:

- - [16/Dec/2016:17:38:39 +0530] "GET /feed/user/785400761?q.l.m=en&debug=enabled HTTP/1.1" 200 - "-" "Jersey/2.25 (HttpUrlConnection 1.8.0_111)" 1041

But in Elasticsearch this line is not parsed; it is stored as a single string under the "message" key.

How can I pre-process this log line into fields and values in Filebeat?

  "_index": "filebeat-2016.12.18",
  "_type": "apache",
  "_id": "AVkOPcVbK0-PQC85yaJh",
  "_score": null,
  "_source": {
    "@timestamp": "2016-12-17T19:24:06.361Z",
    "beat": {
      "hostname": "VER-BLR-LT1599",
      "name": "VER-BLR-LT1599",
      "version": "5.1.1"
    "input_type": "log",
    "message": " - - [12/Sep/2016:16:21:15 +0000] \"GET /favicon.ico HTTP/1.1\" 200 3638 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36\"",
    "offset": 212,
    "source": "/home/sohanvir/Desktop/code-env/web/prod/test/ex.log",
    "type": "apache"
  "fields": {
    "@timestamp": [
  "sort": [

Thanks in advance!


You have to use grok, that is, use Logstash to parse this message into separate fields. Filebeat itself does not parse log lines; it can send them to Logstash directly, or via Redis.
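For example, a minimal filebeat.yml sketch (Filebeat 5.x syntax) that tails your file and ships to Logstash instead of Elasticsearch — the host and port here are assumptions, adjust them to your setup:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /home/sohanvir/Desktop/code-env/web/prod/test/ex.log
    document_type: apache

# Send events to Logstash so a grok filter can parse them,
# rather than writing raw lines straight to Elasticsearch.
output.logstash:
  hosts: ["localhost:5044"]
```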

You can take a look at the documentation about grok: Link.

Moreover, your case is quite simple: it looks like an Apache access log, so in the filter section you could write:

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}



This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.