Logstash using not_analyzed not working


I am a total newbie to the ELK stack and am probably trying to set up a much too complicated config to start with... :slight_smile:

I am running the whole stack on a Windows 7 laptop. I am importing a CSV, which goes well, but I cannot get the string fields to be NOT analyzed, which gives me broken text in the Kibana visualisations.

My last attempt was with a template.

Both the template and the conf file are located in the c:\logstash-1.5.0\bin directory.

This is the conf file:

input {
  file {
    path => "C:\Users\jeroen\Documents\temp\CSV\ElasticSearch_Input_vc.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["snapshot_date_time","Country","Tower","Service","Division","USD Group","Ref Nr","Processtype","Importance","Priority","Severity","Status and Reason","Category","Is_Valid_Category","Summary","Open Date Time","Closed Date Time","Opened By","Last Modified","Resolve Completed Date Time","Hrs_Assigned_To_Completed","First Assign Date Time","Hrs_New_To_Assign","Customer Organization","Requested By","Assignee","Active Flag","In Out SLA Resolution 1"]
    separator => ";"
  }
  date {
    match => [ "snapshot_date_time", "yyyy-MM-dd HH:mm:ss" ]
  }
  mutate {
    convert => { "Hrs_Assigned_To_Completed" => "float" }
    convert => { "Hrs_New_To_Assign" => "float" }
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "qdb-%{+YYYY.MM.dd}"
    workers => 1
    template => "template.json"
  }
  # stdout {
  #   codec => rubydebug
  # }
}

And this is the template (which, honestly, I just copied from another topic and only changed the template name). I am in doubt whether its location on my laptop is correct, and I don't know what to do with the 7th line, as that is probably specific to the data used by the originator...

{
  "template": "qdb-%{+YYYY.MM.dd}",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index": { "query": { "default_field": "userid" } }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_source": { "compress": true },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": { "type": "string", "index": "not_analyzed" }
          }
        }
      ],
      "properties": {
        "date": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "device": { "type": "string", "fields": { "raw": { "type": "string", "index": "not_analyzed" } } },
        "distance": { "type": "integer" }
      }
    }
  }
}

Any help/hints/tips are appreciated!

(Magnus Bäck) #2

Your index template has "qdb-%{+YYYY.MM.dd}" as the index name pattern, but that won't work. That kind of pattern is specific to Logstash's elasticsearch output. Use "qdb-*" or "qdb-????.??.??" instead.
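To see the difference: Elasticsearch matches the `template` field against index names as a simple wildcard pattern, so the Logstash sprintf pattern is taken literally and never matches anything. A quick shell sketch illustrates this (shell glob matching behaves the same way for `*` and `?` here; the sample index name is just an example of what Logstash would create):

```shell
# An index name as Logstash's elasticsearch output would create it:
index="qdb-2015.06.02"

# The sprintf pattern is literal text to Elasticsearch; the wildcard
# patterns actually match:
for pattern in 'qdb-%{+YYYY.MM.dd}' 'qdb-*' 'qdb-????.??.??'; do
  case "$index" in
    ($pattern) echo "$pattern matches $index" ;;
    (*)        echo "$pattern does not match $index" ;;
  esac
done
```

Only the second and third patterns print "matches"; the sprintf pattern falls through to "does not match".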


Hi Magnus, thanks for the reply.

I changed it to "qdb-*", deleted the sincedb file and all indexes in ES, reran Logstash, then deleted the index pattern from Kibana and re-added it, but I am still seeing almost all fields as "analyzed" in Kibana.

Any other ideas?

(Magnus Bäck) #4

What does the actual mapping look like for the index in question? (Use e.g. the get mapping API.)
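For reference, the get mapping API can be called with plain curl (this assumes Elasticsearch is listening on the default localhost:9200; the index name is the one from this thread):

```shell
# Pretty-print the mapping of one index:
curl -XGET 'http://localhost:9200/qdb-2014.02.14/_mapping?pretty'

# Or inspect all matching indexes at once:
curl -XGET 'http://localhost:9200/qdb-*/_mapping?pretty'
```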

"qdb-2014.02.14": {
      "mappings": {
         "core2": {
            "_all": {
               "enabled": false},
            "_source": {
               "compress": true},
            "properties": {
               "@timestamp": {"type": "date", "format": "dateOptionalTime"},
               "@version": {"type": "string"},
               "Active Flag": {"type": "string"},
               "Assignee": {"type": "string"},
               "Category": {"type": "string"},
               "Closed Date Time": {"type": "string"},
               "Country": {"type": "string"},
               "Customer Organization": {"type": "string"},
               "Division": {"type": "string"},
               "First Assign Date Time": {"type": "string"},
               "Hrs_Assigned_To_Completed": {"type": "double"},
               "Hrs_New_To_Assign": {"type": "double"},
               "Importance": {"type": "string"},
               "In Out SLA Resolution 1": {"type": "string"},
               "Is_Valid_Category": {"type": "string"},
               "Last Modified": {"type": "string"},
               "Open Date Time": {"type": "string"},
               "Opened By": {"type": "string"},
               "Priority": {"type": "string"},
               "Processtype": {"type": "string"},
               "Ref Nr": {"type": "string"},
               "Requested By": {"type": "string"},
               "Resolve Completed Date Time": {"type": "string"},
               "Service": {"type": "string"},
               "Severity": {"type": "string"},
               "Status and Reason": {"type": "string"},
               "Summary": {"type": "string"},
               "Tower": {"type": "string"},
               "USD Group": {"type": "string"},
               "host": {"type": "string"},
               "message": {"type": "string"},
               "path": {"type": "string"},
               "snapshot_date_time": {"type": "string"},
               "source_host": {"type": "string","index": "not_analyzed"},
               "tags": {"type": "string","index": "not_analyzed"},
               "type": {"type": "string","index": "not_analyzed"}
            }
         }
      }
   }


I tried another approach: I uploaded a template via curl directly into Elasticsearch, removed the template lines from the conf file, deleted all indexes and the sincedb, and reindexed... voila, it worked!
(Maybe/probably in my original attempts the template.json was in the wrong folder or the naming was not correct...? Maybe I will find out later, but for now it works :slight_smile: )
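For anyone landing here later, uploading a template with curl generally looks like this on Elasticsearch 1.x (a sketch: the template id `qdb_template` is just an example name, and template.json is assumed to be in the current directory):

```shell
# Register the index template under an id of your choosing:
curl -XPUT 'http://localhost:9200/_template/qdb_template' -d @template.json

# Verify it was stored:
curl -XGET 'http://localhost:9200/_template/qdb_template?pretty'
```

Once the template lives in Elasticsearch itself, the `template => "template.json"` line in the Logstash output (and any file-path/naming issues with it) is no longer needed.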

(system) #7