I have imported data into Elasticsearch using the Logstash CSV filter plugin. The first two fields are timestamps that I would like to analyze in Elasticsearch and Kibana. The issue I have is that the two timestamp fields are being imported as strings (this was expected from the available documentation). I have tried the date filter in Logstash as well as a mapping template in order to get a date field instead of a string. I am OK with @timestamp reflecting the import time, as I only want to use createTimeStamp and detectTimeStamp for searching and analysis.
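For reference, this is how I checked the mapping (from the Kibana Dev Tools console; resilient is the index name from my output config below), and both fields come back as string/text rather than date:

GET resilient/_mapping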
Here are my relevant files.
A sample of the createTimeStamp and detectTimeStamp fields from the file:
6/1/17 05:31|5/31/17 08:38|
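If I read the Joda-Time pattern syntax correctly, that sample should correspond to:

6/1/17 05:31  ->  M/d/yy HH:mm

where M is the month, d the day of month, yy the two-digit year, and HH:mm the 24-hour time. The patterns are case-sensitive: lowercase m means minutes, not months.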
Logstash.conf
input {
  file {
    path => [ "/Users/schroew/Documents/ELK/resJune.psv" ]
    type => "ResilientExport"
    tags => [ "ResilientTickets" ]
  }
} # close input

filter {
  if [path] == "/Users/schroew/Documents/ELK/resJune.psv" {
    csv {
      autodetect_column_names => true
      separator => "|"
    }
    # Joda-Time patterns: M = month, d = day, yy = two-digit year,
    # HH:mm = 24-hour time. Case matters: lowercase m is minutes.
    date {
      match => [ "createTimeStamp", "M/d/yy HH:mm" ]
      target => "createTimeStamp"
    }
    date {
      match => [ "detectTimeStamp", "M/d/yy HH:mm" ]
      target => "detectTimeStamp"
    }
    mutate { convert => { "totalUsersImpacted" => "integer" } }
    mutate { remove_field => [ "message" ] }
  } # close if
} # close filter

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "resilient"
  }
} # close output
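To check whether the date filters are actually parsing (on failure an event keeps the original string and is tagged _dateparsefailure), a debugging-only output can be added; this is just a sketch, not part of my real config:

output {
  # Print every event to the console so the parsed timestamps
  # (or a _dateparsefailure tag) are visible.
  stdout { codec => rubydebug }
}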
My attempt at a mapping template:
PUT _template/resilienttemplate_1
{
"template": "res*",
"settings": {
"number_of_shards": 1
},
"mappings": {
"type1": {
"_source": {
"enabled": false
},
"properties": {
"createTimeStamp": {
"type": "date",
"format": "mm/dd/yy HH:MM"
},
"detectTimeStamp": {
"type": "date",
"format": "mm/dd/yy HH:MM"
}
}
}
}
}
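Two details worth noting: the mapping type name has to match the document type Logstash writes (ResilientExport, from the type setting in the file input), and the || in the format string lets the fields accept either the raw M/d/yy HH:mm strings or the ISO8601 values a successful date filter emits. Also, a template is only applied when an index is created, and an existing field's mapping can't be changed in place, so the already-imported index has to be dropped and re-imported (assuming losing the existing data is acceptable):

DELETE resilient

Then re-run the Logstash import and confirm the fields are now mapped as date:

GET resilient/_mapping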