Can't get CSV data into Elasticsearch

So I'm at a complete loss here. I have no clue where my mistake is.
Here's the deal:

I'm trying to index data from a CSV file into Elasticsearch 2.3.2. The input, filter, and output are handled by Logstash 2.3.2.

The .csv file:

timeStamp,elapsed,label,responseCode,responseMessage,threadName,success,grpThreads,allThreads,Latency,Hostname,Connect
1464772797973,1082,http://test4,200,OK,ThreadGroup 1-1,true,2,2,318,DESKTOP-NS,166
1464772798501,1011,http://test4.com,200,OK,ThreadGroup 1-2,true,3,3,185,DESKTOP-NS,94
1464772799061,889,http://test4.com,200,OK,ThreadGroup 1-1,true,3,3,87,DESKTOP-NS,0

The test.conf of Logstash:

[code]
input {
  file {
    type => "csv"
    path => [ "C:\result\results.csv" ]
    start_position => "beginning"
  }
}

filter {
  if [elapsed] == "label" {
    drop { }
  }

  csv {
    columns => ["@jmeter_timestamp", "elapsed", "label", "responseCode", "responseMessage", "threadName",
                "success", "grpThreads", "allThreads", "Latency", "Hostname", "Connect"]
    separator => ","
  }

  date {
    match => [ "@jmeter_timestamp", "UNIX_MS" ]
    target => "@timestamp"
    timezone => "Europe/Amsterdam"
    remove_field => [ "jmeter_timestamp" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.43.51:9200"]
    index => "jmeter-%{+YYYY.MM.dd}"
  }
}
[/code]

My index template with the mapping:

{
    "template": "jmeter-*",
    "settings": {
        "number_of_shards": 1,
        "number_of_replicas": 0,
        "index.refresh_interval": "10s"
    },
    "mappings": {
        "logs": {
            "properties": {
                "@timestamp": {
                    "type": "date",
                    "format": "yyyyMMdd'T'HHmmss.SSSZ"
                },
                "elapsed": {
                    "type": "integer"
                },
                "label": {
                    "type": "string"
                },
                "responseCode": {
                    "type": "integer"
                },
				"responseMessage": {
                    "type": "string"
                },
                "threadName": {
                    "type": "string"
                },
                "success": {
                    "type": "boolean"
                },
                "grpThreads": {
                    "type": "long"
                },
                "allThreads": {
                    "type": "long"
                },
                "Latency": {
                    "type": "long"
                },
                "ErrorCount": {
                    "type": "long"
                },
                "Hostname": {
                    "type": "string"
                },
				"Connect": {
                    "type": "long"
                }
            }
        }
    }
}
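
(As a sanity check, the template Elasticsearch actually holds can be fetched back through the _template API; the name "jmeter" below is an assumption — use whatever name the template was registered under:)

[code]
curl http://192.168.43.51:9200/_template/jmeter?pretty
[/code]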

Kibana is showing no data for any time range: http://i.imgur.com/OMsJTXz.png
Logstash's stdout shows the following:
http://i.imgur.com/yez54yD.png

I tried removing and re-adding the index, with no success.

Can anyone pinpoint my mistake? Thanks.

Can you connect to your ES host from LS, e.g. with telnet?
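
For example, using the host and port from your output config (curl, if you have it, should return the cluster info JSON):

[code]
telnet 192.168.43.51 9200
curl http://192.168.43.51:9200
[/code]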

Yeah, connection-wise everything is good.
See the following:

http://imgur.com/icRo34t

Anyone?

Increasing Logstash's log level by starting it with --verbose or even --debug could give clues.
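
For example:

[code]
logstash -f test.conf --verbose
logstash -f test.conf --debug
[/code]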

Neither --debug nor --verbose gave me any clues. Here's the result:
http://imgur.com/W7tbIvM (Click on image for full screen)

I tried the following:

  1. Deleted the 'jmeter-*' indices in Elasticsearch.
  2. Configured Logstash to upload the template 'jmeter.json' to Elasticsearch.
  3. Started Logstash again, and saw the 'jmeter-' index added in kopf.
  4. In Kibana, added the index pattern again.
  5. On the Discover tab there is no data available for any time range, from the last 15 minutes to the last 5 years.

I think the problem lies in the UNIX_MS timestamp.

The results.csv has a UNIX_MS timestamp, e.g. 1464772797973 (which is 2016-06-01T09:19:57.973Z).
My JSON index template file is set like this:

    "properties": {
        "@timestamp": {
            "type": "date",
            "format": "yyyyMMdd'T'HHmmss.SSSZ"
        },
        ...

And my Logstash config is set up like this:

[code]
csv {
  columns => ["@jmeter_timestamp", ...]
  separator => ","
}
date {
  match => [ "@jmeter_timestamp", "UNIX_MS" ]
  target => "@timestamp"
  timezone => "Europe/Amsterdam"
  remove_field => [ "jmeter_timestamp" ]
}
[/code]

Are these settings correct?
Note: the full/original files are in the first post.

Alright, last try here; I don't want to open a second thread.

I'm trying to index two values into Elasticsearch through Logstash.

The first value is a UNIX_MS timestamp; the second is simply an integer.

My test.json, which Logstash sends to Elasticsearch:

{ "template": "alles-*", "settings": { "number_of_shards": 1, "number_of_replicas": 0, "index.refresh_interval": "10s" }, "mappings": { "logs": { "properties": { "@timestamp": { "type": "date", "format": "yyyyMMdd'T'HHmmss.SSS" }, "elapsed": { "type": "integer" } } } } }

And finally my test.conf file:

[code]
input {
  file {
    type => "csv"
    path => [ "C:/result2/results.jtl" ]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["@jmeter_timestamp", "elapsed"]
    separator => ","
  }
  date {
    match => [ "@jmeter_timestamp", "UNIX_MS" ]
    target => "@timestamp"
    timezone => "Europe/Amsterdam"
    remove_field => [ "jmeter_timestamp" ]
  }
}

output {
  elasticsearch {
    template => "c:/result2/jmeter.json"
    template_name => "alles"
    hosts => ["192.168.43.51:9200"]
    index => "alles-%{+YYYY.MM.dd}"
  }
}
[/code]

When I run logstash -f test.conf --debug or --verbose, it shows no errors.
Still, no data is added to Elasticsearch, although the JSON template is.

http://imgur.com/ONlpl9i

Logstash is probably tailing the input file since it already processed the data once. Set the file input's sincedb_path option to "nul" or use the stdin input and pipe the data to Logstash. If you stick with the file input, make sure you also adjust ignore_older if the input file is older than 24 hours. This topic comes up here a couple of times a week, so there's plenty more in the archives (and sincedb is described in the file input documentation).
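
For example, a minimal sketch of a file input with those options (path taken from your config; adjust as needed):

[code]
input {
  file {
    path => [ "C:/result2/results.jtl" ]
    type => "csv"
    start_position => "beginning"
    # "NUL" is the Windows null device, so no read position is
    # persisted and the file is re-read from the start on every run.
    sincedb_path => "NUL"
    # Prevent the default 24-hour age cutoff from skipping an old file.
    ignore_older => 0
  }
}
[/code]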

There are a couple of weird things in your date filter (for example, remove_field lists "jmeter_timestamp" while the field is named "@jmeter_timestamp"), but let's deal with that later.

This is so weird... I adjusted my Logstash config to:

[code]
input {
  file {
    sincedb_path => "NUL"
    ignore_older => 0
    type => "csv"
    path => [ "C:/result2/results.jtl" ]
    start_position => "beginning"
  }
}
[/code]

And now it shows data in Elasticsearch. I guess Logstash was skipping data it had already processed once.
Thanks @magnusbaeck