Use the date filter to parse the timestamp field you're extracting from the log entry. That'll populate the @timestamp field that you can use in Kibana.
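A minimal sketch of that filter chain might look like the following (the field name `log_timestamp` and the ISO8601 pattern are assumptions; match them to whatever your grok expression actually extracts):

```
filter {
  grok {
    # Extract the timestamp into its own field (pattern is an example)
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:message_detail}" }
  }
  date {
    # Parse the extracted field; on success this sets @timestamp
    match => [ "log_timestamp", "ISO8601" ]
  }
}
```

The date filter writes to `@timestamp` by default, so Kibana's time picker will use the event's original time rather than the ingestion time.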
"Log Level", "Server version", "Java Class Name", "Thread" and "Message detail"
What you have almost works. Just make sure your grok expression extracts the fields you're interested in instead of only matching them, i.e. use %{LOGLEVEL:log_level} instead of %{LOGLEVEL} (note that grok field names can't contain spaces).
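Applied to all five fields, the expression could look something like this. The exact layout and separators are assumptions since the original log format isn't shown; adjust the pattern to your actual lines:

```
filter {
  grok {
    match => {
      # Hypothetical layout: level, version, class, [thread], then the rest
      "message" => "%{LOGLEVEL:log_level} %{DATA:server_version} %{JAVACLASS:java_class_name} \[%{DATA:thread}\] %{GREEDYDATA:message_detail}"
    }
  }
}
```

Each `%{PATTERN:name}` pair stores the matched text under `name`, so the fields show up individually in Kibana instead of being buried in the raw message.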
When I start Filebeat with `sudo ./filebeat -e -c filebeat-logstash.yml -d "publish"`, I always get the following error.
anb041:filebeat-1.3.1 mmlug$ sudo ./filebeat -e -c filebeat-logstash.yml -d "publish"
2016/11/08 13:36:28.292774 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/11/08 13:36:28.294921 publish.go:269: INFO No outputs are defined. Please define one under the output section.
Error Initialising publisher: No outputs are defined. Please define one under the output section.
2016/11/08 13:36:28.294943 beat.go:161: CRIT No outputs are defined. Please define one under the output section.
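The CRIT line means Filebeat could not find an output section in filebeat-logstash.yml (often a YAML indentation problem). For Filebeat 1.x, a minimal Logstash output looks like the sketch below; the host and port are assumptions, so substitute your own Logstash endpoint:

```
# filebeat-logstash.yml (Filebeat 1.x syntax)
output:
  logstash:
    # Replace with the host:port where Logstash's beats input listens
    hosts: ["localhost:5044"]
```

YAML is indentation-sensitive, so `logstash:` must be nested under `output:` with spaces (never tabs), or Filebeat will report that no outputs are defined.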