Sending Logstash's logs to /var/log/logstash but not to elasticsearch

Hi guys..

I'm very new to this, but I don't understand why I can't load my .conf file in /usr/share/logstash/bin using ./logstash -f file.conf.
I get a message saying: "Sending Logstash's logs to /var/log/logstash which is now configured via"
but I can't see my index in Kibana afterwards.

It's on a fresh Ubuntu 17.10 with a newly installed ELK stack.

Will someone be so kind and explain to me what this means?
How do I make it work?

And what's in your config file?

Which one of them do you need? Logstash's YAML file or the .conf file I have been working on?

The .conf file.

input {
  file {
    # exclude expects filename patterns, so "*.gz" rather than ".gz"
    exclude => "*.gz"
    path => "/home/lasse/Skrivebord/51asa5525/messages-2015-10-30"
    start_position => "beginning"
  }
}

filter {
  if [type] == "CISCO_ASA_FIREWALL" {
    grok {
      # the literal square brackets around drop_type must be escaped,
      # otherwise grok treats them as a regex character class
      match => { "message" => "%{SYSLOGTIMESTAMP:Cisco_Time} %{SYSLOGHOST:Cisco_Host} %{NOTSPACE:LogID} \[\s*%{DATA:drop_type}\s*\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}" }
    }
  }
}

output {
  stdout { codec => plain }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    document_type => "text"
    index => "apples"
  }
}
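A side note for debugging: when it is unclear whether events are flowing through the pipeline at all, the rubydebug codec on the stdout output prints each event as a structured hash with all its fields. A minimal sketch of an alternative output section (keeping the same elasticsearch settings as above):

```
output {
  # prints every event, field by field, to the console
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "apples"
  }
}
```

If events appear on stdout but not in Elasticsearch, the problem is on the output side; if nothing appears at all, the input never produced events.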

As I explained in another thread yesterday, start_position => "beginning" only matters for new and previously unseen files. Logstash is probably tailing the input file.
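One common workaround while testing (a sketch, not the only fix) is to point the file input's sincedb at /dev/null, so Logstash never persists its read position and re-reads the file from the beginning on every run:

```
input {
  file {
    path => "/home/lasse/Skrivebord/51asa5525/messages-2015-10-30"
    start_position => "beginning"
    # no read position is saved, so the file is treated as unseen
    # on every restart (works on Linux/macOS)
    sincedb_path => "/dev/null"
  }
}
```

Alternatively, delete the sincedb files that Logstash keeps in its data directory before re-running the pipeline.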

But how could Logstash have seen these files before when it's a fresh install?!
I don't understand why Logstash doesn't read from the beginning of the file - seems odd?

In any case, what do I have to do to fix this issue?

Thanks in advance!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.