Config file won't stash a log

Hi:)
I'm very new to Logstash and Elasticsearch. I'm trying to stash my first log into Logstash in a way that I can (correct me if that is not the purpose) search it using Elasticsearch....

I have a log that looks like this basically:

2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml

So, I have created a config file test.conf that looks like this:

input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
  }
}
filter {
  grok {
    match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second} - %{LOGLEVEL:level} - %{WORD:scriptName}.%{WORD:scriptEND} - " }
  }
}
output {
  elasticsearch {
    hosts =>  ["localhost:9200"]
    index => "ecommerce"
    codec => line { format => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second} - %{level} - %{scriptName}.%{scriptEND} - \"%{message}\"" }
  }
}

And then : ./bin/logstash -f test.conf

I do not see the log in Elasticsearch when I go to: http://localhost:9200/ecommerce or to http://localhost:9200/ecommerce/test-type/NEWTRY

Please tell me what I'm doing wrong.... :confused:

Thanks,
Heather

Add a stdout and see if there is anything in the output.
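For example, you could add something like this to your output section (rubydebug just pretty-prints each event to the console so you can see what Logstash is actually producing):

output {
  stdout { codec => rubydebug }
}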
If not, then it's sincedb :slight_smile:

Since you're doing it from the CLI, Logstash will write a .sincedb_XXXXXXX file in your directory

The file input remembers how far it has read each file, and that position is stored in the sincedb file.

If you're testing, you can set the file input option sincedb_path => "/dev/null", but only do this when testing with files. Or you can just delete the .sincedb_XXXXXX files.
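In your config that would look something like this (only the sincedb_path line is new; path is from your example):

input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
    sincedb_path => "/dev/null"  # testing only: re-reads the file on every run
  }
}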

Other than that I see no reason it's not working

I tried both -- no change in results... I'm supposed to see it when I go to http://localhost:9200/ecommerce/, right? Or am I checking in the wrong place...?

So what do you see in the console then?

I see that it has run a health check, and then a warning that the connection to the ES instance has been restored... https://drive.google.com/open?id=0BwAfQjh0z4O8ZmJTSktpem5LS3c

When I use the debug log level I see what looks like a fatal error related to pattern loading, like Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."} https://drive.google.com/open?id=0BwAfQjh0z4O8VmdRZnI1cXdXZmM .. But I have checked my pattern in the grok debugger and it seemed okay...

It may also be https://www.elastic.co/guide/en/logstash/5.1/plugins-inputs-file.html#plugins-inputs-file-start_position
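That option tells the file input where to begin reading a file it hasn't seen before. To pick up existing lines in your log, it would look something like this:

input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    start_position => "beginning"  # default is "end", which only picks up new lines
  }
}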

OMG it has worked!! THANK YOU SO MUCH!! :grinning: :grinning: :grinning:
Do you know maybe why it was required...?

Because the default is end :wink:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.