Mutate not converting string to number

Hi,

I'm trying to put the .csv output of a JMeter test run into ELK for dashboarding, etc. However, when the file is pushed through Logstash, it sets all the fields to strings, when I want fields like 'elapsed' or 'SampleCount' to be numbers.

So I checked around, and the filter plugins 'csv' and 'mutate' seem to be the ones I want for pushing the fields into ELK as integers, floats or other numbers.

However, when I try to do this it always ends up pushing them in as strings again. Could someone give me a bit of guidance on what I'm doing wrong here?

My logstash.conf file looks like

input {
	stdin{}
	}
filter {
	if ([message] =~ "responseCode") {
			drop{ }
	} else {
		csv {
			columns => ["timeStamp","elapsed","label","responseCode","responseMessage","threadName","dataType","success","bytes","grpThreads","allThreads","URL","Latency","SampleCount","ErrorCount","Hostname","IdleTime"]
		}
		mutate{
				convert => { "elapsed" => "integer"}
			}
	}
}
output {
	#stdout { codec => rubydebug }
	elasticsearch {
		hosts => "localhost:9200"
        #port => "443"       # set to 80 if you want to use HTTP and not HTTPS
        ssl => "false"       # set to false if you don't want to use SSL/HTTPS
        index => "jmeter-results"
        manage_template => false
	}
}

And the .csv file I am trying to read from looks like

timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,bytes,grpThreads,allThreads,URL,Latency,SampleCount,ErrorCount,Hostname,IdleTime
    2016/04/20 10:43:24.980,495,Navigate to Web Client: 001 Path,200,OK,TC00000 - Setup 1-1,text,true,5259,1,1,https://10.157.38.112:8443/programnameWebApp/,495,1,0,ip-10-157-38-80,0
    2016/04/20 10:43:25.489,12,Navigate to Web Client: 002 resources-programname.js,200,OK,TC00000 - Setup 1-1,text,true,6789,1,1,https://10.157.38.112:8443/programnameWebApp/resources/js/api/programname.js,12,1,0,ip-10-157-38-80,0
    2016/04/20 10:43:25.507,13,Navigate to Web Client: 003 resources-atmosphere-2.2.1.js,200,OK,TC00000 - Setup 1-1,text,true,131509,1,1,https://10.157.38.112:8443/programnameWebApp/resources/js/atmosphere-2.2.1.js,5,1,0,ip-10-157-38-80,0
    2016/04/20 10:43:25.527,4,Navigate to Web Client: 004 nocache.js,200,OK,TC00000 - Setup 1-1,text,true,9792,1,1,https://10.157.38.112:8443/programnameWebApp/programnameWeb/programnameWeb.nocache.js,4,1,0,ip-10-157-38-80,0
    2016/04/20 10:43:25.565,5,Navigate to Web Client: 005 resources-detect_timezone.js,200,OK,TC00000 - Setup 1-1,text,true,13956,1,1,https://10.157.38.112:8443/programnameWebApp/resources/js/detect_timezone.js,5,1,0,ip-10-157-38-80,0
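
As a quick sanity check of what the csv + mutate combination should produce, here's a short Python sketch (not part of Logstash, purely an illustration) that parses the first of those rows against the same column list and converts 'elapsed' to an integer:

```python
import csv
import io

# Same column list as in the logstash.conf csv filter above.
columns = ["timeStamp","elapsed","label","responseCode","responseMessage",
           "threadName","dataType","success","bytes","grpThreads","allThreads",
           "URL","Latency","SampleCount","ErrorCount","Hostname","IdleTime"]

row = ("2016/04/20 10:43:24.980,495,Navigate to Web Client: 001 Path,200,OK,"
       "TC00000 - Setup 1-1,text,true,5259,1,1,"
       "https://10.157.38.112:8443/programnameWebApp/,495,1,0,ip-10-157-38-80,0")

fields = dict(zip(columns, next(csv.reader(io.StringIO(row)))))
fields["elapsed"] = int(fields["elapsed"])  # what mutate's convert should do

print(fields["elapsed"], type(fields["elapsed"]).__name__)  # 495 int
```

If the conversion in Logstash works the same way, 'elapsed' should reach Elasticsearch as a number, not a quoted string.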

Sorry, I don't see any error in your configuration.
It should just work...

Hmm, that's disappointing; I was hoping there was some obvious syntax error in there that I was just too blind to see.

I'll try the extractnumbers plugin, see if that gets me anywhere.

I personally use this plugin and it works fine.

The problem may be elsewhere.
For example, if 'elapsed' wasn't converted somewhere else, the first document stored in Elasticsearch could have caused Elasticsearch to map that field as a string.
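
That first-document behaviour can be shown with a toy sketch (purely illustrative; this is not how Elasticsearch is implemented, just the idea that the first value seen fixes a field's type):

```python
# Toy model of dynamic mapping: the first document to arrive decides each
# field's type, and later documents are coerced to match the stored mapping.
mapping = {}

def index_doc(doc):
    indexed = {}
    for field, value in doc.items():
        if field not in mapping:
            mapping[field] = type(value)        # first value fixes the type
        indexed[field] = mapping[field](value)  # later values are coerced
    return indexed

first = index_doc({"elapsed": "495"})  # arrived as a string -> mapped as str
second = index_doc({"elapsed": 12})    # integer coerced back to a string

print(mapping["elapsed"].__name__, repr(second["elapsed"]))  # str '12'
```

In other words: once an index has mapped 'elapsed' as a string, converting it in the mutate filter won't change how that index stores it.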

I deleted the index that initially loaded this as a string. Is this data type stored elsewhere?

Hey, I used your .csv file as an input and everything is working fine.

This is the output I'm getting. As you can see, 'elapsed' has been converted to an integer; its value is not quoted.

"message" => " 2016/04/20 10:43:25.565,5,Navigate to Web Client: 005 resources-detect_timezone.js,200,OK,TC00000 - Setup 1-1,text,true,13956,1,1,https://10.157.38.112:8443/programnameWebApp/resources/js/detect_timezone.js,5,1,0,ip-10-157-38-80,0",
"@version" => "1",
"@timestamp" => "2016-05-06T11:42:13.519Z",
"path" => "/opt/logstash/logs/basic.csv",
"host" => "localhost.localdomain",
"type" => "log",
"timeStamp" => " 2016/04/20 10:43:25.565",
"elapsed" => 5,
"label" => "Navigate to Web Client: 005 resources-detect_timezone.js",
"responseCode" => "200",
"responseMessage" => "OK",
"threadName" => "TC00000 - Setup 1-1",
"dataType" => "text",
"success" => "true",
"bytes" => "13956",
"grpThreads" => "1",
"allThreads" => "1",
"URL" => "https://10.157.38.112:8443/programnameWebApp/resources/js/detect_timezone.js",
"Latency" => "5",
"SampleCount" => "1",
"ErrorCount" => "0",
"Hostname" => "ip-10-157-38-80",
"IdleTime" => "0"

Here is my conf file

input {
	file {
		path => "/opt/logstash/logs/basic.csv"
		type => "log"
		start_position => "beginning"
	}
}
filter {
	if ([message] =~ "responseCode") {
		drop{ }
	} else {
		csv {
			columns => ["timeStamp","elapsed","label","responseCode","responseMessage","threadName","dataType","success","bytes","grpThreads","allThreads","URL","Latency","SampleCount","ErrorCount","Hostname","IdleTime"]
		}
		mutate{
			convert => { "elapsed" => "integer"}
		}
	}
}
output {
	stdout { codec => rubydebug }
	elasticsearch {
		hosts => "localhost:9200"
		#port => "443"       # set to 80 if you want to use HTTP and not HTTPS
		ssl => "false"       # set to false if you don't want to use SSL/HTTPS
		index => "jmeter-results"
		manage_template => false
	}
}

Right, so if it's working for you, then maybe the problem is with my ELK stack. I'll try cleaning it up, maybe even do a complete reinstall; it's a dev box right now anyway.

Hi,

You could check how your field is mapped in Elasticsearch using GET http://elastic-host:port/your-index/_mapping.
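
For example, a trimmed-down _mapping response might look like the JSON below (the shape follows the index and type names from the configs above; the values are hypothetical), and a few lines of Python pull out the field's type so you can see whether it ended up as a string or an integer:

```python
import json

# Hypothetical, trimmed-down response from
# GET http://localhost:9200/jmeter-results/_mapping
response = json.loads("""
{
  "jmeter-results": {
    "mappings": {
      "log": {
        "properties": {
          "elapsed": {"type": "string"},
          "Latency": {"type": "string"}
        }
      }
    }
  }
}
""")

props = response["jmeter-results"]["mappings"]["log"]["properties"]
print(props["elapsed"]["type"])  # if this says "string", the mapping is the problem
```

If 'elapsed' comes back as "string", delete the index (or reindex into a new one) after fixing the conversion, since an existing mapping won't change.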

Another thing to check: is there a template registered in Elasticsearch that defines the "elapsed" field type?
GET /_template

Got it to work, thanks for your help.