Hi Magnus, I saw your previous notes on this here: Extract timestamp from filename. I have tried to populate my date field with the date that is in the name of the log file, as shown below, but it is not working. Any suggestions?
File Name:
Cangenbus-10-17-19.csv
My Mapping:
PUT cangenbus-v5
{
  "mappings": {
    "doc": {
      "properties": {
        "Name": { "type": "text" },
        "Number": { "type": "integer", "ignore_malformed": true },
        "Path": { "type": "text" }
      }
    }
  }
}
====================================
{
  "acknowledged": true,
  "shards_acknowledged": true,
  "index": "cangenbus-v5"
}
=======================================
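One thing I noticed while writing this up: my mapping above has no date-typed field at all, so even if the grok worked, the extracted value would not be indexed as a date. I assume I would need to add something like this (the field name `filetimestamp` is just my guess, chosen to match the `target` in my date filter below):

```
PUT cangenbus-v5
{
  "mappings": {
    "doc": {
      "properties": {
        "Name": { "type": "text" },
        "Number": { "type": "integer", "ignore_malformed": true },
        "Path": { "type": "text" },
        "filetimestamp": { "type": "date" }
      }
    }
  }
}
```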
My Configuration File for Logstash:
input {
  file {
    path => "/opt/sample-data/cangenbus-csv/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Name","Number"]
  }
  grok {
    match => ["filename", "(?<filetimestamp>[%{YEAR}%{MONTHNUM}%{MONTHDAY}])"]
  }
  date {
    target => "filetimestamp"
  }
output {
  elasticsearch {
    hosts => "http://10.0.2.15:9200"
    index => "cangenbus-v5"
  }
  stdout {}
}
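For what it's worth, here is what I think the filter section should look like, though I am not sure about the details: as far as I know the file input stores the filename in the `path` field (not `filename`), the date filter needs a `match` option, and I believe my filter block above is missing its closing brace before `output`. The `MM-dd-yy` layout is my guess from the `10-17-19` in the filename:

```
filter {
  csv {
    separator => ","
    columns => ["Name","Number"]
  }
  grok {
    # the file input stores the full path of the file in the "path" field
    match => ["path", "(?<filetimestamp>%{MONTHNUM}-%{MONTHDAY}-%{YEAR})"]
  }
  date {
    # parse e.g. "10-17-19" out of Cangenbus-10-17-19.csv
    match => ["filetimestamp", "MM-dd-yy"]
    target => "filetimestamp"
  }
}  # <- closing brace for the filter block
```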
========================
Error:
[2017-12-01T14:24:54,019][WARN ][logstash.filters.csv ] Error parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>}
Error:
[FATAL] 2017-12-01 15:25:30.752 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, => at line 22, column 18 (byte 377) after filter {
csv {
separator => ","
columns => ["Name","Number"]
}
grok {
match => ["filename", "(?<filetimestamp>[%{YEAR}-%{MONTHNUM}-%{MONTHDAY}])"]
}
date {
target => "filetimestamp"
}
output {
elasticsearch
root@ubuntu-16:/etc/logstash/conf.d#
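Reading the second error again, I think the parser is complaining because it reaches `output {` while the `filter` block is still open, so it tries to treat `output` as a filter plugin. I.e. the overall structure would need to be:

```
filter {
  # ... csv, grok, date ...
}   # <- this closing brace seems to be missing in my config
output {
  # ... elasticsearch, stdout ...
}
```

Is that the right reading of "Expected one of #, =>"?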