Splitting a log message into fields to create a chart based on those fields

I want to be able to split the log "message" field into three fields (log_time, log_level, log_data) so that I can create a pie chart based on those fields. How do I do it?

Current Logstash config:

input {
    beats {
        # receive events shipped by Filebeat
        port => 5044
    }
}

filter {
    grok {
        # parse the log line into timestamp, level, and message text
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA}%{LOGLEVEL:level}: %{GREEDYDATA:msg}" }
    }

    date {
        # use the parsed timestamp as the event's @timestamp
        match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        sniffing => true
        manage_template => false
        index => "web-symfony-app"
    }
}

My log pattern looks like below:

[2017-03-12 10:44:19] security.INFO: Populated the TokenStorage with an anonymous Token. [] []
[2017-03-10 16:45:50] pheanstalk.INFO: Watch {"payload":{"tube":"user_create_test"},"pheanstalk":"primary"} []

So, breaking it down, each line has three sections (with a channel prefix such as security. or pheanstalk. before the level): [TIMESTAMP_ISO8601] LOGLEVEL: GREEDYDATA
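In other words, I was hoping for a grok along these lines, renaming the captures to the field names I want for the chart (log_channel is just a placeholder name I made up for the security./pheanstalk. prefix):

filter {
    grok {
        # capture the three sections into the fields used for the pie chart
        match => { "message" => "\[%{TIMESTAMP_ISO8601:log_time}\] %{DATA:log_channel}%{LOGLEVEL:log_level}: %{GREEDYDATA:log_data}" }
    }
}

Once those fields exist in the index, the pie chart should just be a terms aggregation on log_level (or log_level.keyword, depending on the mapping).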

An example Logstash record looks like this:

{
  "_index": "web-symfony-app",
  "_type": "symfony-app",
  "_id": "AVz_XPdtUo6EnAM7FuTM",
  "_score": null,
  "_source": {
    "message": "2017-03-10 16:45:50] pheanstalk.INFO: Watch {\"payload\":{\"tube\":\"user_create_test\"},\"pheanstalk\":\"primary\"} []",
    "@version": "1",
    "@timestamp": "2017-07-07T14:36:34.258Z",
    "type": "symfony-app",
    "input_type": "log",
    "count": 1,
    "beat": {
      "hostname": "web",
      "name": "web"
    },
    "offset": 529329,
    "fields": null,
    "source": "/var/log/symfony/app.log",
    "host": "web",
    "tags": [
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ],
    "index_name": "web-symfony-app"
  },
  "fields": {
    "@timestamp": [
      1499438194258
    ]
  },
  "sort": [
    1499438194258
  ]
}

Your grok expression isn't working because the actual log line doesn't begin with [.

That is just a copy+paste typo on my part.

Well, either way the problem is that your grok expression isn't working. Simplify it as much as you can (start with ^\[%{TIMESTAMP_ISO8601:timestamp}\]) and try again. If that works, continue building the expression to gradually match more and more of the string.
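For example, the progression could look something like this (field names as in your current config). Start with just the bracketed timestamp:

^\[%{TIMESTAMP_ISO8601:timestamp}\]

Once that matches, extend it to cover the channel prefix and log level:

^\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA}%{LOGLEVEL:level}:

And finally capture the remainder of the line:

^\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA}%{LOGLEVEL:level}: %{GREEDYDATA:msg}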

Thanks for pointing that out; I'll deal with it later on. Do you have an answer to my original question, or a link to an example?

The grok failure is the reason you're not getting the fields you're asking for, so you should address it now rather than later.
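The _grokparsefailure tag in the record you posted is that failure showing up. While you iterate on the pattern, one option (just a sketch, assuming you can watch Logstash's console output) is to print failing events to stdout:

output {
    # dump events whose grok parse failed so the pattern can be debugged
    if "_grokparsefailure" in [tags] {
        stdout { codec => rubydebug }
    }
}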

You are absolutely right! I didn't get exactly what you meant at first, but I understand now. Thanks for enlightening me. I never knew/noticed that the grok regex is what creates those fields in the Elasticsearch index.
