Configuring Logstash for syslog

Hi everyone,

I would like to know where my mistake is.

My setup is:
Kibana 4.1.1
Elasticsearch 1.7.1
Logstash 1.5.4

I want to get syslog messages into Elasticsearch. To do this, I configured Logstash like this:

input {
  tcp {
    port => 514
    type => syslog
  }

  udp {
    port => 514
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
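
As I understand it, for a hypothetical line such as

Dec 23 14:30:01 myhost sshd[1234]: Failed password for root from 10.0.0.1

the grok filter should pull out syslog_timestamp => "Dec 23 14:30:01", syslog_hostname => "myhost", syslog_program => "sshd", syslog_pid => "1234" and syslog_message => "Failed password for root from 10.0.0.1", and the date filter should then set @timestamp from syslog_timestamp.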

output {
    elasticsearch {
        hosts => ["MYSERVER:9200"]
        index => "syslog-%{+YYYY.MM.dd}"
    }
}
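
If I understand the index setting correctly, events should end up in daily indices like syslog-2015.08.17 (that date is just an example).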

But when I go to Kibana I see this:

[screenshot: every event has tags: _grokparsefailure]

My syslog messages come from switches, vCenter, and other software that generates syslog, and I always get the same error.

Thank you for your help,

John

What's your question?

Why do all my syslog messages have "tags: _grokparsefailure"?

I would like to see the content of the syslog messages.

When you use the Grok debugger, does your Grok filter work?
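
If the online debugger is awkward, you can also test locally with a minimal pipeline that reads from stdin and prints the parsed event (a sketch; the sample line below is made up):

input { stdin { } }

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
  }
}

output { stdout { codec => rubydebug } }

Run bin/logstash -f test.conf and paste something like "Dec 23 14:30:01 myhost sshd[1234]: Failed password for root". If you get _grokparsefailure there too, the pattern itself is the problem. One common cause: messages arriving over a raw tcp/udp input usually still start with the RFC3164 priority header (e.g. <34>Oct 11 22:14:15 ...), which your pattern doesn't account for.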

I think you have more configuration than you're telling us. The filters you listed above aren't used at all, since the type of your example messages is "testsyslog" and not "syslog". How can that be, when the configuration you posted doesn't contain the word "testsyslog" anywhere? And which grok filter is adding the _grokparsefailure tag?
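
Also, a side note unrelated to the grok failure: binding to ports below 1024 normally requires root, so unless Logstash runs as root your tcp/udp inputs on port 514 won't even start. A common workaround (the port number here is arbitrary) is to listen on an unprivileged port:

input {
  tcp {
    port => 5514
    type => syslog
  }
  udp {
    port => 5514
    type => syslog
  }
}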
