I'm very new to this, but I don't understand why I can't load my .conf file from /usr/share/logstash/bin using ./logstash -f file.conf.
I get the message: "Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties",
but I can't see my index in Kibana afterwards.
It's a fresh Ubuntu 17.10 machine with a newly installed ELK stack.
Would someone be so kind as to explain what this message means?
How do I make it work?
filter {
  if [type] == "CISCO_ASA_FIREWALL" {
    grok {
      # Square brackets are regex metacharacters, so they must be escaped in the pattern
      match => { "message" => "%{SYSLOGTIMESTAMP:Cisco_Time} %{SYSLOGHOST:Cisco_Host} %{NOTSPACE:LogID} \[\s*%{DATA:drop_type}\s*\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}" }
    }
  }
}
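This snippet only shows the filter block; for documents to end up in an index that Kibana can see, the pipeline also needs an input and an elasticsearch output. A minimal output sketch, assuming Elasticsearch is on localhost:9200 (the index name is just a placeholder):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Hypothetical index name - use whatever you want to find in Kibana
    index => "cisco-asa-%{+YYYY.MM.dd}"
  }
}

The config can also be syntax-checked before starting the pipeline:

./logstash -f file.conf --config.test_and_exit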
As I explained in another thread yesterday, start_position => "beginning" only matters for new and previously unseen files. Logstash is probably tailing the input file.
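For reference, a minimal file input sketch that forces a re-read on every run; the path is a placeholder, and pointing sincedb_path at /dev/null stops Logstash from remembering how far it has already read, so it starts from the top each time:

input {
  file {
    # Placeholder path - point this at the actual log file
    path => "/var/log/cisco/asa.log"
    type => "CISCO_ASA_FIREWALL"
    # Only honoured the first time Logstash sees a given file
    start_position => "beginning"
    # Discard the stored read position so the whole file is read on every run
    sincedb_path => "/dev/null"
  }
}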
But how could Logstash have seen these files before, when it's a fresh install?!
I don't understand why Logstash doesn't read from the beginning of the file - that seems odd.
In any case, what do I have to do to fix this issue?
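To check whether Logstash has created any index at all (assuming Elasticsearch is running on localhost:9200), the indices can be listed directly:

curl 'localhost:9200/_cat/indices?v'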