Help Re-indexing Elasticsearch with Logstash

Greetings All,
I am extremely new to logging and the whole ELK stack world. I built my ELK stack using the [DigitalOcean instructions][1]. I've been logging and tweaking things for a month or so, and I'm really excited about the results and discoveries I'm making. Recently this warning appeared in the Settings tab for the logstash-* index pattern:

> Mapping conflict! 4 fields are defined as several types (string, integer, etc) across the indices that match this pattern. You may still be able to use these conflict fields in parts of Kibana, but they will be unavailable for functions that require Kibana to know their type. Correcting this issue will require reindexing your data.

Changing the sort order on the type field revealed that the four conflicting fields are:

src_port conflict
severity conflict
vlan conflict
dest_port conflict
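
From what I've read, the field mapping API should show how each index has mapped a given field, so I was thinking of running something like this (the field names are the four from my list above) to compare the types across the daily indices:

curl -XGET 'localhost:9200/logstash-*/_mapping/field/src_port,severity,vlan,dest_port?pretty'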

My first thought was "Help! I've broken it". However, after doing some reading, it seems that reindexing is quite a normal practice and that using Logstash to do it is the easiest way to go.
I've found a Logstash config that looks promising [here][2]. For reference, here is the current state of my indices:

curl -XGET localhost:9200/_cat/indices?v
health status index               pri rep docs.count docs.deleted store.size pri.store.size
yellow open   logstash-2015.10.14   5   1    1140646            0        1gb            1gb
yellow open   logstash-2015.09.30   5   1     150459            0    167.3mb        167.3mb

                      ...many lines removed to reduce the size of this post.

yellow open   .kibana               1   1         28            1    206.1kb        206.1kb
yellow open   logstash-2015.09.16   5   1        252            0    803.7kb        803.7kb
yellow open   logstash-2015.10.09   5   1     143231            0      157mb          157mb
yellow open   logstash-2015.10.03   5   1      87553            0     99.8mb         99.8mb
yellow open   logstash-2015.10.11   5   1     830334            0    690.2mb        690.2mb

I'm really not sure how to go about this, or whether what I'm proposing will actually solve the problem.
Where should I look to collect the right information to help find a solution?

What goes in the @metadata space in the config below?

input {
  # Read every document back out of the source index with a scan/scroll search
  elasticsearch {
    hosts => [ "localhost" ]
    port => "9200"
    index => "logstash-2015.10.11"   # the index to re-read (one day's index)
    size => 1000                     # documents fetched per scroll page
    scroll => "5m"                   # how long each scroll context stays alive
    docinfo => true                  # copy _index, _type and _id into [@metadata]
    scan => true                     # use a scan-type search (no sorting)
  }
}

output {
  # Write each document back into Elasticsearch using the metadata captured above
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "%{[@metadata][_index]}"       # same index name as the source document
    index_type => "%{[@metadata][_type]}"   # same document type
    document_id => "%{[@metadata][_id]}"    # same _id, so documents are overwritten rather than duplicated
  }
  # Print one dot per event so there is some visible progress
  stdout {
    codec => "dots"
  }
}
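
If that config is on the right track, my rough plan (just a guess on my part) would be to save it as something like reindex.conf and run it with the Logstash binary from the package install:

# assuming the apt package location used in the DigitalOcean guide
/opt/logstash/bin/logstash -f reindex.conf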

If more information is needed, please give me the curl command and I'll happily provide it.
Many thanks in advance.
[1]: How To Install Elasticsearch, Logstash, and Kibana (ELK Stack) on Ubuntu 14.04 | DigitalOcean
[2]: Reindexing Elasticsearch with Logstash 2.0 · GitHub

Take a read of this blog post for more info on what @metadata is about: https://www.elastic.co/blog/logstash-metadata

Short answer: nothing, you don't set it yourself; LS uses it to keep track of the important stuff 🙂
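
If you want to see what actually ends up in there while you test, one quick way (a sketch, assuming Logstash 1.5+ where the rubydebug codec gained a metadata option) is to dump events to stdout with metadata included:

output {
  stdout {
    # rubydebug hides @metadata by default; metadata => true prints it as well
    codec => rubydebug { metadata => true }
  }
}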