Hello,
I have set up two processes to collect data and send it to ELK.
On the one hand, I have a process that collects log data from an IIS server; on the other, a process that imports the data from a daily CSV into a new index in Elasticsearch.
The two configuration files that I load into Logstash are the following. The first one collects the IIS logs:
input {
  beats {
    port => 5044
  }
}

filter {
  # ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  # check that fields match your IIS log settings
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:s-sitename} %{NOTSPACE:s-computername} %{IPORHOST:s-ip} %{WORD:cs-method} %{URIPATH:cs-uri-stem} (?:-|\"%{URIPATH:cs-uri-query}\") %{NUMBER:s-port} %{NOTSPACE:cs-username} %{IPORHOST:c-ip} %{NOTSPACE:cs-version} %{NOTSPACE:cs-useragent} %{NOTSPACE:cs-cookie} %{NOTSPACE:cs-referer} %{NOTSPACE:cs-host} %{NUMBER:sc-status} %{NUMBER:sc-substatus} %{NUMBER:sc-win32-status} %{NUMBER:sc-bytes} %{NUMBER:cs-bytes} %{NUMBER:time-taken}" }
  }

  # set the event timestamp from the log
  # https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
  date {
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "Etc/UTC"
  }

  # matches the big, long nasty useragent string to the actual browser name, version, etc
  # https://www.elastic.co/guide/en/logstash/current/plugins-filters-useragent.html
  useragent {
    source => "cs-useragent"
    prefix => "browser_"
  }

  mutate {
    remove_field => [ "log_timestamp" ]
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch_server:9200"]
    index => "iis-log-%{+YYYY.MM.dd}"
  }
}
And this is the second file, which imports the daily CSV:
input {
  file {
    path => "/etc/logstash/conf.d/nextcloud-report-usage/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => [
      "user-id",
      "date",
      "last_login_date",
      "assigned_quota",
      "used_quota",
      "number_files",
      "number_shares",
      "number_uploads",
      "number_downloads"
    ]
    remove_field => ["message"]
  }

  date {
    match => [ "date", "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
  date {
    match => [ "last_login_date", "yyyy-MM-dd HH:mm:ss" ]
  }

  mutate {
    convert => {
      "assigned_quota"   => "integer"
      "used_quota"       => "integer"
      "number_files"     => "integer"
      "number_shares"    => "integer"
      "number_uploads"   => "integer"
      "number_downloads" => "integer"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch_server:9200"]
    index => "nextcloud-metrics-%{+YYYY.MM.dd}"
  }
}
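For reference, a single row of the CSV (taken from the event.original field in the error I paste further down) looks like this, with the fields in the same order as the columns declared in the csv filter:

```
apiuser,2022-05-02T13:17:40+00:00,2022-05-02T13:01:07+00:00,26843545600,22869957,36,0,0,6
```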
How can I have these two processes running simultaneously? That is, the IIS log collection and the daily load of data from the CSV file.
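From what I have read in the documentation, multiple pipelines can be declared in pipelines.yml so that both run at once; would something like this sketch be the right approach? (The pipeline ids and the file names/paths here are assumptions on my part, not my real layout.)

```yaml
# /etc/logstash/pipelines.yml -- one entry per pipeline
- pipeline.id: iis-logs
  path.config: "/etc/logstash/conf.d/iis-logs.conf"
- pipeline.id: nextcloud-csv
  path.config: "/etc/logstash/conf.d/nextcloud-csv.conf"
```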
Separately, I launched Logstash to test the data load from the CSV file and to check that everything is correct and that it generates the corresponding index properly, and in the log I see the following error:
0390e14f3b27abd3423a1] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"nextcloud-metrics-2022.05.04", :routing=>nil}, {"tags"=>["_dateparsefailure"], "last_login_date"=>"2022-05-02T13:01:07+00:00", "number_uploads"=>0, "used_quota"=>22869957, "log"=>{"file"=>{"path"=>"/etc/logstash/conf.d/nextcloud-report-usage/usage_report_04052022_2.csv"}}, "date"=>"2022-05-02T13:17:40+00:00", "number_files"=>36, "assigned_quota"=>26843545600, "@version"=>"1", "number_downloads"=>6, "event"=>{"original"=>"apiuser,2022-05-02T13:17:40+00:00,2022-05-02T13:01:07+00:00,26843545600,22869957,36,0,0,6"}, "@timestamp"=>2022-05-04T07:35:18.689267Z, "user-id"=>"apiuser", "number_shares"=>0, "host"=>{"name"=>"tfvs0756.medusa.gobiernodecanarias.net"}}], :response=>{"index"=>{"_index"=>"nextcloud-metrics-2022.05.04", "_id"=>"VQP-jYABi-PU5vurrNZ5", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [last_login_date] cannot be changed from type [text] to [date]"}}}}
In this same run I also see that it does not load all the data from the CSV file: it should load 149 records, which is what the file contains, but it only loads 143, and I cannot see where the problem is.
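To rule out a problem with the file itself, I counted the data rows with a small script like the one below before comparing against the document count in the index (the two inline rows are just sample data in the same format as my export; in my case I read the real file instead, and I am assuming the export has no header row, since my csv filter declares the columns by hand):

```python
import csv
import io

# sample data standing in for the real usage_report_*.csv file
sample = """apiuser,2022-05-02T13:17:40+00:00,2022-05-02T13:01:07+00:00,26843545600,22869957,36,0,0,6
otheruser,2022-05-03T09:00:00+00:00,2022-05-01T08:00:00+00:00,26843545600,1024,5,1,2,3
"""

# parse with the same separator as the Logstash csv filter, skipping blank lines
rows = [r for r in csv.reader(io.StringIO(sample), delimiter=",") if r]

print(len(rows))                 # total records Logstash should ingest
print({len(r) for r in rows})    # field counts per row; should only contain 9
```

Comparing this count with the index makes it easy to see how many rows were dropped, though not which ones.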
I hope you can help me, as I am just starting out with this tool.
Best regards.