Filebeat Index Available, but not Logstash Index

I recently moved my Elasticsearch service to a new node, so I now have a separate node each for Elasticsearch, Kibana, and Logstash (previously Elasticsearch and Kibana were on the same server). Additionally, I upgraded Kibana from 4.4.x to 4.5.1.

Since doing this and firing up the new Elasticsearch instance, no Logstash indexes are being created, just filebeat-2016.06.30 and topbeat-2016.06.30. What do I need to do to get the Logstash index populating again?
I've attempted to reindex the data under the proper index name using the commands found here:
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-reindex.html
However, as expected, the new index gets created but is no longer populated.

My Filebeat service doesn't set the index; Logstash sets it through the elasticsearch output:
output {
  elasticsearch {
    hosts => [""]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Any ideas what I need to do to create the logstash index? I've tried from Kibana's 'Indices' page, but it can't find the default 'logstash-*' index pattern, as the index truly doesn't exist on the backend! Help! Thanks!

Kibana: 4.5.1
Logstash: 2.2.4
Elasticsearch: 2.2.3

I'm not sure what you mean here. The configuration you've posted indicates that you want to have the indexes named after the Beat names. That's fine. But then you talk about indexes with names beginning with "logstash". Where's the configuration that would create those indexes?

You're right. What configuration would I need in place to get a logstash index generating? I apparently need a Logstash filter in order to get .raw fields in my visualizations and a bunch of other useful tools.

I'm trying to migrate my stack over to physical machines from VMs. When attempting to recreate the same visualizations, I'm seeing that no '.raw' fields are available to choose from. This creates the problem where every string is analyzed. Thus a server name such as prod-server-02 gets broken down and analyzed as the strings 'prod', 'server', and '02', breaking my ability to filter events on any string containing non-alphabetical characters.
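(For anyone reproducing this: the tokenization can be seen directly with the analyze API; this is a sketch of a request you can run from Sense or curl against your own node:

GET /_analyze?analyzer=standard&text=prod-server-02

It returns the tokens 'prod', 'server', and '02', which is exactly how an analyzed string field gets indexed.)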

I want to just update the mapping of my current index using Sense, but the problem is that I'm creating new date-suffixed indices daily.

You're right. What configuration would I need in place to get a logstash index generating?

An elasticsearch output where the index option begins with "logstash-"?

I apparently need a Logstash filter in order to get .raw fields in my visualizations and a bunch of other useful tools.

No, the .raw subfields have nothing to do with Logstash. Their presence is controlled via the index's mappings.

When attempting to recreate the same visualizations, I'm seeing that no '.raw' fields are available to choose from.

The default index template used with Logstash's elasticsearch output only applies to indexes whose names begin with "logstash-". You could store all your logs in the same index series (using a separate index series for each beat type isn't necessarily a good idea) or you can make sure that the index templates used will apply to the indexes you've asked Logstash to create.
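If you keep the per-beat indexes, one way to do the latter is to install a template covering the beat indexes that carries the same string mapping the Logstash template uses. A sketch (the template name beats_raw and the *beat-* pattern are my assumptions; *beat-* happens to match both filebeat-* and topbeat-*):

PUT /_template/beats_raw
{
  "template": "*beat-*",
  "order": 1,
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "string_fields": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "index": "analyzed",
              "fields": {
                "raw": {
                  "type": "string",
                  "index": "not_analyzed",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      ]
    }
  }
}

Note that templates only apply to indexes created after the template is installed, so existing daily indexes would need to be reindexed.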

Thank you for your prompt reply. I have decided to simply index all logs coming through this Logstash pipeline as logstash-*, as seen below:

output {
  elasticsearch {
    hosts => ["my-server:9200"]
#   index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    index => "logstash-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

So, if I'm understanding you correctly, this should allow me to use the default index pattern mapping, logstash-*, as I was doing in my previous config? This is all so strange, as I'd set it all up nearly exactly the same as before, yet none of the '.raw' fields are showing up. The only real change I made was upgrading Kibana in order to install Sense.

It looks like the best solution for me would be to index based on the beat name (as you mentioned, Magnus) and simply add an index template (as shown here: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html) that maps the specific fields I need to not_analyzed '.raw' subfields, so the strings I care about are maintained in their entirety.

Magnus, does this sound like the proper approach to you? Anything you'd do differently?

Follow-up:

I've been digging through the templates on my Elasticsearch node and found a generic Logstash template. In it there is a 'dynamic_templates' section covering strings that I'm considering using:

{
  "string_fields": {
    "match": "*",
    "match_mapping_type": "string",
    "mapping": {
      "type": "string",
      "index": "analyzed",
      "omit_norms": true,
      "fielddata": {
        "format": "disabled"
      },
      "fields": {
        "raw": {
          "type": "string",
          "index": "not_analyzed",
          "ignore_above": 256,
          "doc_values": true
        }
      }
    }
  }
},

I am hoping this will work after creating a new index. I'll update this thread for documentation purposes, as I've seen this same issue scattered across the forums recently.

So, if I'm understanding you correctly, this should allow me to use the default index pattern mapping, logstash-*?

Yes.

That worked for getting back not_analyzed fields; however, it broke my Topbeat monitoring, as you mentioned it would. I've gone back to creating the indexes based on the beat name, and I'm currently attempting to add a dynamic template as follows.

I find the template I need to modify:

GET /_template/
{
  "filebeat": {
    "order": 0,
    "template": "filebeat-*",
    "settings": {
      "index": {
        "refresh_interval": "5s"
      }
    },
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "template1": {
              "mapping": {
                "index": "not_analyzed",
                "ignore_above": 1024,
                "doc_values": true,
                "type": "{dynamic_type}"
              },
              "match": "*"
            }
          }
        ],
        "properties": {
          "message": {
            "index": "analyzed",
            "type": "string"
          },
          "@timestamp": {
            "type": "date"
          },
          "geoip": {
            "dynamic": true,
            "properties": {
              "location": {
                "type": "geo_point"
              }
            },
            "type": "object"
          },
          "offset": {
            "doc_values": "true",
            "type": "long"
          }
        },
        "_all": {
          "enabled": true,
          "norms": {
            "enabled": false
          }
        }
      }
    },
    "aliases": {}
  }
}

As seen in the documentation (https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html), I attempt to put a new dynamic mapping in place:

PUT /_template/filebeat-*/template1
{
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "fields": {
                "raw": {
                  "type": "string",
                  "index": "not_analyzed",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      ]
    }
  }
}

However, I get the following error:

{
  "error": {
    "root_cause": [
      {
        "type": "action_request_validation_exception",
        "reason": "Validation Failed: 1: template is missing;"
      }
    ],
    "type": "action_request_validation_exception",
    "reason": "Validation Failed: 1: template is missing;"
  },
  "status": 400
}

I've attempted multiple PUT statements but keep getting a similar error. Any ideas?

If you compare your PUT request with the example in the documentation you'll see that a) you're missing the template key and b) your URL has one level too many.
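For comparison, a corrected request could look like the following sketch (the template name filebeat_raw and the order value are my assumptions; the key points are that the URL contains only the template name and the body contains the required template pattern):

PUT /_template/filebeat_raw
{
  "template": "filebeat-*",
  "order": 1,
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "strings": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "fields": {
                "raw": {
                  "type": "string",
                  "index": "not_analyzed",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      ]
    }
  }
}

Using a separate template with a higher order value merges it on top of the existing filebeat template rather than replacing it.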