I'm going crazy. I've been searching all day, but with no success.
I've got this data, which I want to import into Elasticsearch with Logstash:
[code]1465309033156,84,http://test.com/purchase.php,200,OK,ThreadGroup 1-3,true,6,6,84,testPC,0
1465309033469,176,http://test.com/,200,OK,ThreadGroup 1-7,true,7,7,176,testPC,91
1465309034108,210,http://test.com/,200,OK,ThreadGroup 1-4,true,7,7,210,testPC,82
1465309054246,85,http://test.com/reserve.php,200,OK,ThreadGroup 1-10,true,11,11,85,testPC,0
1465309054611,90,http://test.com/purchase.php,200,OK,ThreadGroup 1-4,true,11,11,89,testPC,0
1465309055131,91,http://test.com/purchase.php,200,OK,ThreadGroup 1-2,true,11,11,91,testPC,0
[/code]
This is my filter in Logstash:
[code]
filter {
  csv {
    columns => ["@jmeter_timestamp", "elapsed", "url", "responseCode", "responseMessage",
                "threadName", "success", "grpThreads", "allThreads", "Latency", "Hostname", "Connect"]
    separator => ","
  }
  date {
    locale => "en"
    match => [ "@jmeter_timestamp", "UNIX_MS" ]
    remove_field => [ "@jmeter_timestamp" ]
    target => "@timestamp"
    timezone => "Europe/Amsterdam"
  }
}
[/code]
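To rule out the date filter itself, I also tried checking it in isolation. A minimal sketch (assuming a local Logstash install, reading a bare epoch-millisecond value from stdin and printing the parsed event):

[code]
input { stdin { } }
filter {
  date {
    # the whole stdin line lands in "message", so parse that
    match => [ "message", "UNIX_MS" ]
    target => "@timestamp"
    timezone => "Europe/Amsterdam"
  }
}
output { stdout { codec => rubydebug } }
[/code]

Pasting a value like 1465309033156 prints an event whose @timestamp is the ISO8601 form, i.e. the same kind of value Elasticsearch is rejecting in the error below.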
Part of my JSON mapping file:
[code]
"mappings": {
  "logs": {
    "properties": {
      "@timestamp": {
        "type": "date"
      },
[/code]
But the error message states:
[code]
"error"=>{"type"=>"mapper_parsing_exception",
  "reason"=>"failed to parse [@timestamp]",
  "caused_by"=>{"type"=>"illegal_argument_exception",
    "reason"=>"Invalid format: \"2016-06-07T14:17:34.611Z\" is malformed at \"-06-07T14:17:34.611Z\""}}}}, :level=>:warn}
[/code]
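The "malformed at \"-06-...\"" part makes me wonder whether the index is parsing @timestamp with a format that only accepts epoch milliseconds, so that parsing stops after the year. If that's the cause, I assume a mapping that explicitly accepts both epoch millis and ISO8601 would look something like this (a sketch, not my actual file):

[code]
"mappings": {
  "logs": {
    "properties": {
      "@timestamp": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis"
      }
    }
  }
}
[/code]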
I thought I had stated the match in my date filter correctly?
The stdout output looks like this: http://imgur.com/eWv3Vdw
And in Kopf I can see that an index was added.