Dear ELK/Grok Experts,
First of all, I'm new to ELK. I'm reading/processing 3 log files together through a single Logstash conf file (called "textlogs.conf", see below) and, using custom Grok patterns, I'm trying to index those log files into Elasticsearch so that I can visualize them in Kibana. The versions I'm using are elasticsearch-1.6.0, logstash-1.5.2, and kibana-4.1.1-windows.

Somehow I can't get Logstash to connect to Elasticsearch: running the conf file doesn't index anything. I have checked each of my custom Grok patterns on the grokdebug.herokuapp.com website, and they all match perfectly, so the custom Grok patterns themselves are not the issue. I was using logstash-1.5.1 until last week, when I upgraded to logstash-1.5.2, and I've been having this problem ever since. Out of frustration, I went so far as to delete all my previous indices (including .kibana), delete all the ELK folders, re-download and unzip ELK, reboot my computer, and start from scratch. Elasticsearch still shows no index except .kibana. I wonder whether there's a bug in the conf file (despite the clean, well-tested Grok patterns), or whether it has to do with a cluster/upgrade error. I would appreciate any thoughts or solutions. Please see below for the conf file and screenshots of all my runs:
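For what it's worth, my first log file is just tab-separated lines of server name, message count, and date. Here is a quick Python sketch with a made-up sample line (the line and the field names are only illustrative, not from my real logs) showing what I expect Grok to extract, using a plain-regex equivalent of the pattern:

```python
import re

# Made-up sample line in the shape of report_1_total_messages_per_server.log:
# server name, message count, and date, separated by tabs.
line = "mailserver01\t1532\t2015-07-20"

# Plain-regex stand-in for
# %{DATA:server}\t%{NUMBER:total_messages_server}\t%{YEAR}-%{MONTHNUM}-%{MONTHDAY}
pattern = re.compile(
    r"^(?P<server>.+?)\t(?P<total_messages_server>\d+)\t(?P<date>\d{4}-\d{2}-\d{2})$"
)

match = pattern.match(line)
print(match.groupdict())
# → {'server': 'mailserver01', 'total_messages_server': '1532', 'date': '2015-07-20'}
```

This is exactly the kind of line that matches fine on grokdebug, which is why I don't think the patterns are the problem.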
Thank you so much!
Regards,
Ahmad
file: "textlogs.conf"
    input {
      file {
        type => "total_messages_per_server"
        path => "C:\Users\ahmadmar\Documents\ELK\VM_Work\Email_Dashboard\text-logs\report_1_total_messages_per_server.log"
      }
      file {
        type => "total_messages_per_sender_address"
        path => "C:\Users\ahmadmar\Documents\ELK\VM_Work\Email_Dashboard\text-logs\report_9_total_messages_per_sender_address_top10.log"
      }
      file {
        type => "distribution_sent_emails_general"
        path => "C:\Users\ahmadmar\Documents\ELK\VM_Work\Email_Dashboard\text-logs\report_distribution_sent_emails_general.log"
      }
    }

    filter {
      if [type] == "total_messages_per_server" {
        grok {
          match => { "message" => "%{DATA:server}\t%{NUMBER:total_messages_server}\t(?<date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY})" }
        }
      }
      if [type] == "total_messages_per_sender_address" {
        grok {
          match => { "message" => "(?<sender_address>[a-zA-Z0-9_.+-=:]+@[a-zA-Z0-9_.+-=:]+)\t%{NUMBER:total_messages}\t(?<date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY})" }
        }
      }
      if [type] == "distribution_sent_emails_general" {
        grok {
          match => { "message" => "(?<value>%{WORD}|%{NUMBER})\t(?<date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY})" }
        }
      }
    }

    output {
      if ([type] == "total_messages_per_server" or [type] == "total_messages_per_sender_address" or [type] == "distribution_sent_emails_general") {
        elasticsearch {
          host => "localhost"
          index => "testlogs"
        }
        stdout { codec => rubydebug }
      }
    }

----------

![|690x374](upload://5BU22xFqoG19KJpn6OdX5hpeOnC.PNG)
![|690x371](upload://w1u1CTGSCdOBff77p4Y24K5hDhl.PNG)
![|690x365](upload://oXXzFkH39QHnt0pKylAIfdp9jl5.PNG)

I'm using Sense (the Google Chrome extension for Elasticsearch) for my curl requests. Here are the screenshots of the curl results:

![|690x233](upload://adZi4onfxEsd6ysFpFUg7RqKINm.PNG)
![|690x146](upload://mwQClqdIs3cPLwNtMvPcgfSXLmE.PNG)
![|690x136](upload://lv1z4LNqASvp82wZYnheott7dbR.PNG)
![|690x275](upload://yz6tzzkzbbAA1LABPhDUip5TWmY.PNG)
![|325x500](upload://aCjpCGIUdKHwiRJdqTFaAWz0Qez.PNG)
![|519x500](upload://2VSBjvQxdLg4acKm5R36iBYVv8J.PNG)
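P.S. One thing I still plan to try is the file input's sincedb handling. If I understand correctly, Logstash remembers how far it has read each file (in sincedb files under the user home directory by default, so deleting the ELK folders would not have reset them), and existing files are only tailed from the end unless told otherwise. A variant of my first input block that I'm considering (the `sincedb_path` value is just my guess at the Windows equivalent of /dev/null):

```
file {
  type => "total_messages_per_server"
  path => "C:/Users/ahmadmar/Documents/ELK/VM_Work/Email_Dashboard/text-logs/report_1_total_messages_per_server.log"
  start_position => "beginning"
  sincedb_path => "NUL"
}
```

Does that sound like a plausible explanation for why nothing gets indexed even after starting from scratch?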