Hello!
I know I'm trying to do something simple, and I'm just getting started with Elasticsearch/Logstash, but I can't seem to hit on the right combination, so I was hoping someone could point me in the right direction.
I am trying to import a bunch of old syslog messages into Elasticsearch using Logstash, since I already have a Logstash configuration file that works for current syslog messages. The catch is that these older syslog files need the date parsed out of each message and used as the event timestamp, and I don't know how or where to put that date conversion in my configuration file.
The logs that I want to ingest look like this:
2020-02-19 23:59:59 Local7.Notice 172.25.0.1 date=2020-02-19 time=23:59:59 devname="firewall" ...
Here is my Logstash configuration file:
input {
  file {
    path => "/home/templogs/parse.txt"
    type => "forti_log"
    start_position => "beginning"
    ignore_older => 3000000
  }
}
filter {
  if [type] == "forti_log" {
    kv {
      source => "message"
      exclude_keys => [ "type", "subtype" ]
    }
    geoip { source => "dst" }
    geoip { source => "dstip" }
    geoip { source => "src" }
    geoip { source => "srcip" }
    mutate {
      rename => [ "dst", "dst_ip" ]
      rename => [ "dstip", "dst_ip" ]
      rename => [ "dstport", "dst_port" ]
      rename => [ "devname", "device_id" ]
      rename => [ "status", "action" ]
      rename => [ "src", "src_ip" ]
      rename => [ "srcip", "src_ip" ]
      rename => [ "zone", "src_intf" ]
      rename => [ "srcintf", "src_intf" ]
      rename => [ "srcport", "src_port" ]
      rename => [ "rcvd", "bytes_received" ]
      rename => [ "rcvdbyte", "bytes_received" ]
      rename => [ "sentbyte", "bytes_sent" ]
      rename => [ "sent", "bytes_sent" ]
      convert => [ "bytes_received", "integer" ]
      convert => [ "bytes_sent", "integer" ]
      remove_field => [ "msg" ]
    }
  }
}
output {
  if [type] == "forti_log" {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => "localhost:9200"
      index => "forti-%{+YYYY.MM.dd}"
    }
  }
}
Can someone please show me where I should put the date match commands?
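My best guess (and it is only a guess) is to add something like the following inside the forti_log filter block, right after the kv filter, since kv should already be giving me date and time fields out of the message. The timezone line is just an assumption on my part:

    # combine the kv-extracted date and time fields into one temporary field
    mutate {
      add_field => { "log_timestamp" => "%{date} %{time}" }
    }
    # parse that field into @timestamp, then drop it
    date {
      match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "UTC"   # assuming the old logs are in UTC; adjust if not
      target => "@timestamp"
    }
    mutate {
      remove_field => [ "log_timestamp" ]
    }

Is that the right spot for it, or should I be grokking the timestamp at the start of the line instead?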
And, far less importantly: it seems Logstash only reads newly written lines in my parse.txt file. I can fake it out with "cat syslog >> parse.txt" after Logstash is running, but I wondered whether there is a way to have Logstash simply read the file from top to bottom and then exit once the whole input file has been parsed.
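I came across a "read" mode for the file input that sounds like what I want. Would something like this work? I'm not sure I have these options right or whether my plugin version supports them, and the completed-log path below is just made up:

input {
  file {
    path => "/home/templogs/parse.txt"
    type => "forti_log"
    mode => "read"                   # read the whole file instead of tailing it
    exit_after_read => true          # shut down once the file has been read (I think)
    file_completed_action => "log"   # log, rather than delete, the file when done
    file_completed_log_path => "/home/templogs/parse_completed.log"
    sincedb_path => "/dev/null"      # so a re-run starts from the top again
  }
}

Or is feeding the file in through stdin the better approach for a one-off import?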
Thanks for any offered help!