Grok pattern for Logstash Logs

I am writing a Logstash configuration file to ship Logstash's own logs into Elasticsearch (so that I can eventually view them in Kibana).

A typical Logstash log file looks like this:

{:timestamp=>"2016-02-17T14:59:36.052000-0600", :message=>"Pipeline started", :level=>:info}
{:timestamp=>"2016-02-17T14:59:36.980000-0600", :message=>"Automatic template management enabled", :manage_template=>"true", :level=>:info}

I want to capture the "level => info" part as a field, i.e., the field name should be "level" and the field value should be "info".

I wrote the following:

filter {
  if [ type ] == "logstash_indexer" {
    mutate {
      replace => [ "message", "%{message}}" ]
      gsub => [ 'message', '\n', '' ]
    }
    if [ message ] =~ /^{.*}$/ {
      json { source => message }
    }
    if [ message ] =~ /info/ {
      mutate {
        add_field => [ "level", "info" ]
      }
    }
  }
}

But the result isn't what I expect. Could any of you guide me toward getting level=>info recognized as a field? Where am I going wrong?

Thanks!

Excluding the mutate part, are you able to send logs to Elasticsearch?

Even with the mutate part, I am able to send logs, but I can't see the field "level" with value "info" in Kibana. How can I edit the configuration so that the field and value show up in Kibana?

Use a grok filter. The json filter isn't useful since the log isn't in JSON format (it's Ruby hash notation, with symbols like :level).
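A minimal sketch of what that grok filter could look like, assuming your log lines follow the format shown above. The captured field names (timestamp, log_message, level) are illustrative choices, not anything Logstash requires:

```
filter {
  if [type] == "logstash_indexer" {
    grok {
      # Pull the timestamp, the message text, and the level out of the
      # Ruby-hash-style line. %{GREEDYDATA} (unnamed) skips any extra
      # key/value pairs such as :manage_template=>"true".
      match => {
        "message" => "{:timestamp=>\"%{TIMESTAMP_ISO8601:timestamp}\", :message=>\"%{DATA:log_message}\"%{GREEDYDATA}:level=>:%{WORD:level}}"
      }
    }
  }
}
```

Since grok matches substrings rather than anchoring to the whole line, if you only care about the level you could get away with just `match => { "message" => ":level=>:%{WORD:level}" }`.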