How to process old logs?

Just starting out with ELK.
I need to ingest old syslog files into ELK so that the events are indexed at the time they actually occurred, i.e., the timestamp shown in the log line becomes the event timestamp in Elasticsearch.

My first attempt ended up with the entire line stored as the message, and with the event date set to the ingest time (now).
A sample message is:
Oct 13 06:27:54 ip-10-2-7-128 dhclient[914]: DHCPACK of from

My Logstash configuration is:

input {
  file {
    path => "/var/log/backup/syslog.5"
    type => "syslog"
    start_position => "beginning"
  }
}

filter {
  if [fileset][module] == "system" {
    if [fileset][name] == "syslog.5" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }
        pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
        remove_field => "message"
      }
      date {
        match => [ "timestamp", "MMM dd HH:mm:ss" ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["ES:9021"]
    user => "user"
    password => "passw0rd"
    manage_template => false
    index => "logstash--%{+YYYY.MM.dd}"
  }
}

So what do I need to change in order to have the events indexed at their exact time rather than the ingest time?
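For what it's worth, my current guess (which may be wrong) is that the date filter has to reference the exact field the grok filter writes the timestamp into. A minimal filter along those lines, assuming the nested field name from the pattern above:

```
filter {
  grok {
    # Extract the leading syslog timestamp into its own (nested) field
    match => { "message" => "%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{GREEDYDATA:rest}" }
  }
  date {
    # Reference the same field grok populated, not a bare "timestamp";
    # syslog pads single-digit days with a space, hence "MMM  d HH:mm:ss"
    match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
```

Is that the right direction, or is something else causing @timestamp to default to now?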
