Reading only the new logs, and stopping the automatic syslog generation

Hi,

I am using the latest versions of Logstash, Kibana, and Elasticsearch.

In fact I have two problems, and I hope you can help me.

The first problem is that if I add another log line to access.log, Logstash re-reads all the old lines as well! Is that a bug? And how do I tell Logstash to read only the new line?

This is my config:

input {
  file {
    path => "/root/elastic/access.log"
    type => "apache"
  }
}

filter {
  grok {
    patterns_dir => "./patterns"
    match => [ "message", "%{HTTPDATE:dating}" ]
  }
  date {
    locale => "fr"
    match => [ "dating", "dd/MM/YYYY:HH:mm:ss:SSS" ]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }

  elasticsearch {
    protocol => "http"
    host => "10.198.114.113"
  }
}

The second problem is that Kibana generates a syslog-style timestamp automatically (@timestamp)!
So how can I stop this generation and tell Kibana to use only the timestamps from the logs in my file?

Thanks a lot.


Problem 2: look at the "date" filter, which lets you use the timestamp from the source data instead of the indexing date.

Problem 1: based on your configuration, it will only read access.log. If you had a * or a directory name it might be a different story, but since you specify a single file name, no other files should be read in.
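For contrast, this is roughly what a glob path would look like; with a wildcard like this, the file input watches every matching file in the directory (the path below is illustrative, not from the original config):

```
input {
  file {
    # Watches ALL .log files under /root/elastic, not just access.log
    path => "/root/elastic/*.log"
    type => "apache"
  }
}
```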

Hi,

thanks for your response.

problem 2 :
Can you explain the solution in more detail? I am a beginner with Logstash. :slight_smile:

For problem 1: it reads only access.log, but if I add another log line, it also re-reads the old lines in the same file.

Thanks

Problem 2: I probably should have given you the link for it, sorry:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
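A minimal sketch of what the date filter does for you, reusing the field name and pattern from the config earlier in this thread: it parses the timestamp out of the log line itself and writes it to @timestamp, so the event carries the time from the log rather than the time Logstash read it.

```
filter {
  date {
    # "dating" is the field extracted by the grok filter above;
    # the pattern must match the format inside the log line
    match => [ "dating", "dd/MM/YYYY:HH:mm:ss:SSS" ]
    target => "@timestamp"
  }
}
```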

Can you provide an example config for problem 1?

Problem 1:
For example, my file access.log contains these lines:

14/06/2015:10:41:24:114
14/06/2015:10:42:24:114
14/06/2015:10:43:24:114
14/06/2015:10:44:24:114

Kibana will read these logs.

If I add a line and save, my file looks like this:
15/06/2015:10:41:24:114
14/06/2015:10:41:24:114
14/06/2015:10:42:24:114
14/06/2015:10:43:24:114
14/06/2015:10:44:24:114

So Kibana reads the old lines again too, and I end up with 9 events instead of 5!

I can't resolve either problem!

Oussama,

Using the file input on a file and adding lines to it with an editor will in fact produce what you described: when the editor saves the file, a new inode is created, and the file input treats it as a new file and re-reads it from the beginning.

To test, you should instead append lines to your file, for example on Linux:
$ echo "foobar" >> /root/elastic/access.log
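The inode behavior described above can be checked directly. This is a sketch for GNU/Linux (it uses GNU coreutils' `stat -c %i`; the paths are illustrative): a plain append keeps the inode, while an editor-style save, which typically writes a temporary file and renames it over the original, produces a new inode.

```shell
# Create the file and note its inode number
echo "14/06/2015:10:41:24:114" > /tmp/access.log
before=$(stat -c %i /tmp/access.log)

# Plain append: the inode is unchanged, so the file input
# only picks up the new line
echo "15/06/2015:10:41:24:114" >> /tmp/access.log
echo "after append: $(stat -c %i /tmp/access.log) (was $before)"

# Editor-style save: write a temp copy and rename it over the
# original -- the inode changes, so the file looks brand new
cp /tmp/access.log /tmp/access.log.tmp
mv /tmp/access.log.tmp /tmp/access.log
echo "after editor-style save: $(stat -c %i /tmp/access.log) (was $before)"
```

With an unchanged inode the file input resumes from its recorded position; with a new inode it starts over at offset 0, which is exactly the duplicate-events symptom from earlier in the thread.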

Now, for the timestamp parsing: do you have any _grokparsefailure or _dateparsefailure tags in your events?

Colin

Hi,

Thank you very much; now I see why it re-reads the old lines.

For the timestamp parsing, I don't have any _grokparsefailure or _dateparsefailure tags.

And in Kibana's Discover view I can't stop the automatic generation of @timestamp. See the picture.

Thanks

Logstash will always produce an @timestamp field, and Kibana relies on that field. Is this a problem for you?

Hi,

Yes, that is my problem :slight_smile: . Can I resolve it?

As I said, Kibana relies on the presence of that field. Why is that a problem?

(Perhaps Kibana 4 allows the name of the timestamp field to be configured. I haven't checked.)

Kibana generates log events automatically, with type logs or syslog. What I want is to stop this generation, or at least not display them in Discover. I want Kibana to read only the logs from my log file and display those.

Kibana is a tool to visualize logs over time. For the "over time" part to be useful there must be a notion of time in the data, and the idea is to have Logstash extract that from the logs (it happens to store it in the @timestamp field). If you just want to display the raw logs, I'm not sure what use Kibana is to you.