Hi :)
I'm very new to Logstash and Elasticsearch. I'm trying to ship my first log to Logstash so that I can search it with Elasticsearch (correct me if that's not the purpose)...
Basically, I have a log whose lines look like this:
2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
So, I have created a config file test.conf that looks like this:
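Roughly this (I'm writing it out from memory, so treat the file path and the grok pattern as approximations; "ecommerce" is the index name I query later):

    input {
      file {
        # placeholder path -- the real location of my log file
        path => "/path/to/my/app.log"
        start_position => "beginning"
      }
    }

    filter {
      grok {
        # matches lines like: 2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} - %{LOGLEVEL:level} - %{DATA:source} - %{GREEDYDATA:msg}" }
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "ecommerce"
      }
      # also print each event to the console so I can see what gets indexed
      stdout { codec => rubydebug }
    }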
Since you're running it from the CLI, it will write a .sincedb_XXXXXXX file in your working directory.
The file input remembers how far it has read each file, and that position is stored in the sincedb file.
If you're testing, you can set the file input option sincedb_path => "/dev/null", but only do this for test files. Or you can just delete the .sincedb_XXXXXX files.
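For example, an input section just for testing could look like this (the path is a placeholder):

    input {
      file {
        path => "/path/to/your/app.log"    # placeholder path
        start_position => "beginning"      # read the file from the start
        sincedb_path => "/dev/null"        # testing only: don't persist the read position
      }
    }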
I tried both, but there was no change in the results... I'm supposed to see it when I go to http://localhost:9200/ecommerce/, right? Or am I checking in the wrong place...?
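In case it helps, these are the URLs I've been checking (I'm assuming these are the right places to look):

    http://localhost:9200/_cat/indices?v            (to list all indices and see whether ecommerce exists)
    http://localhost:9200/ecommerce/_search?pretty  (to see the documents indexed into ecommerce)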
When I run with the debug log level I see output about patterns being added that looks like a fatal error, e.g. Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."} (screenshot: https://drive.google.com/open?id=0BwAfQjh0z4O8VmdRZnI1cXdXZmM)... But I have checked my patterns in the grok debugger and they seemed okay...