Logstash not stashing logs even after starting successfully


#1

Hi all,

I have an Apache log file that I want to ingest into Logstash and then send the stashed data to Elasticsearch.

I run the logstash command like this:

.\logstash -f logstash.conf

After running the command, the last line of the output is:

Successfully started Logstash API endpoint

Ideally I should then see all the log lines being uploaded to Elasticsearch, but that isn't happening.

Here is my logstash.conf

input {
  file {
    path => "E:\elk\logstash\apache_logs"
    type => "apache_access"
    start_position => "beginning"
  }
}

filter {
   if [type] in [ "apache" , "apache_access" , "apache-access" ]  {
      grok {
         match => [
         "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}",
         "message" , "%{COMMONAPACHELOG}+%{GREEDYDATA:extra_fields}"
         ]
         overwrite => [ "message" ]
      }
      mutate {
         convert => ["response", "integer"]
         convert => ["bytes", "integer"]
         convert => ["responsetime", "float"]
      }
      geoip {
         source => "clientip"
         target => "geoip"
         add_tag => [ "apache-geoip" ]
      }
      date {
         match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
         remove_field => [ "timestamp" ]
      }
      useragent {
         source => "agent"
      }
   }
   if [type] in ["apache_error","apache-error"] {
      grok {
         match => ["message", "\[%{WORD:dayname} %{WORD:month} %{DATA:day} %{DATA:hour}:%{DATA:minute}:%{DATA:second} %{YEAR:year}\] \[%{NOTSPACE:loglevel}\] (?:\[client %{IPORHOST:clientip}\] ){0,1}%{GREEDYDATA:message}"]
         overwrite => [ "message" ]
      }
      mutate
      {
         add_field =>
         {
            "time_stamp" => "%{day}/%{month}/%{year}:%{hour}:%{minute}:%{second}"
         }
      }
      date {
         match => ["time_stamp", "dd/MMM/YYYY:HH:mm:ss"]
         remove_field => [ "time_stamp","day","dayname","month","hour","minute","second","year"]
      }
   }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "apache-%{+YYYY.MM.dd}"
    document_type => "system_logs"
  }
  stdout { codec => rubydebug }
} 

My Elasticsearch indices look like this:

I have attempted to execute the above command several times, but the logs never get stashed. What am I doing wrong? Can someone please help me out?


#2

That will read a file called apache_logs. Is that what you want, or do you want to read all the files in that directory?
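For reference, the path option accepts glob patterns, so reading every file in that directory would look something like this (the exact pattern is just an illustration):

```
input {
  file {
    # A trailing glob matches every file in the directory;
    # forward slashes are the safer choice for paths on Windows.
    path => "E:/elk/logstash/*"
  }
}
```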


#3

I just want to read apache_logs


#4

Should I alter the config file?


#5

Are additional lines being appended to "E:\elk\logstash\apache_logs"?


#6

Sorry, what do you mean by additional lines?


#7

The file input tails the log file. If no new lines are added then it does not stash anything.
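If the goal is a one-shot import of a complete file rather than tailing, the file input also has a read mode (available in recent versions of the file input plugin). A minimal sketch, assuming the path from your config:

```
input {
  file {
    path => "E:/elk/logstash/apache_logs"
    type => "apache_access"
    mode => "read"                  # read the whole file once instead of tailing it
    file_completed_action => "log"  # keep the file after reading; log its name instead of deleting it
    file_completed_log_path => "E:/elk/logstash/completed.log"
  }
}
```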


#8

This is the first time I am inserting these lines, so ideally all of them should get stashed.


#9

You said previously that you had run the command several times. Once it has run once, it will not stash anything unless additional lines are appended to the file. So if the configuration had an issue the first time you ran the command, a sincedb file may have been created with the file's size already recorded, and subsequent runs will skip everything up to that position.
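One common way to rule this out while testing on Windows is to point sincedb_path at NUL (the Windows equivalent of /dev/null), so the read position is never persisted and the file is re-read from the beginning on every run. A sketch based on your existing input block:

```
input {
  file {
    path => "E:/elk/logstash/apache_logs"
    type => "apache_access"
    start_position => "beginning"
    sincedb_path => "NUL"  # Windows: never persist the read position (use "/dev/null" on Linux)
  }
}
```

Alternatively, stop Logstash and delete the sincedb file from the data directory before re-running.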


#10

Thanks! I understand what the problem may be


(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.