Rollover doesn't work

Hi Community,

I've tried to set up a rollover into a new index:

curl -XPUT http://HOST/rollover- -d'
 {
     "mappings": {
       "syslog": {
         "_all": {
           "enabled": true,
           "omit_norms": true
         },
         "dynamic_templates": [
          {
            "message_field": {
              "mapping": {
                "fielddata": {
                  "format": "disabled"
                },
                "index": "analyzed",
                "omit_norms": true,
                "type": "string"
              },
              "match": "message",
              "match_mapping_type": "string"
            }
          },
          {
            "string_fields": {
              "mapping": {
                "fielddata": {
                  "format": "disabled"
                },
                "index": "analyzed",
                "omit_norms": true,
                "type": "string",
                "fields": {
                  "raw": {
                    "ignore_above": 256,
                    "index": "not_analyzed",
                    "type": "string"
                  }
                }
              },
              "match": "*",
              "match_mapping_type": "string"
            }
          }
        ],
        "properties": {
           "@timestamp": {
             "type": "date",
             "format": "strict_date_optional_time||epoch_millis"
           },
           "@version": {
             "type": "string",
             "index": "not_analyzed"
           },
           "conditions": {
            "properties": {
              "[max_age: 1h]": {
                "type": "boolean"
              }
            }
           },
           "bytes": {
             "type": "long"
           },
           "client_address": {
             "type": "ip"
           },
           "duration": {
             "type": "long"
           },
           "geoip": {
            "properties": {
              "coordinates": {
                "type": "float"
              },
              "ip": {
                "type": "ip"
              },
              "latitude": {
                "type": "float"
              },
              "location": {
                "type": "geo_point"
              },
              "longitude": {
                "type": "float"
              }
            }
          },
          "new_index": {
            "type": "string",
            "norms": {
              "enabled": false
            },
            "fielddata": {
              "format": "disabled"
            },
            "fields": {
              "raw": {
                "type": "string",
                "index": "not_analyzed",
                "ignore_above": 256
              }
            }
          },
          "old_index": {
            "type": "string",
            "norms": {
              "enabled": false
            },
            "fielddata": {
              "format": "disabled"
            },
            "fields": {
              "raw": {
                "type": "string",
                "index": "not_analyzed",
                "ignore_above": 256
              }
            }
          },
          "rolled_over": {
            "type": "boolean"
          }
 }
 }
 }
 }'

This index does not roll over...

What do you mean?
What version are you on?
What do your logs say?

Please provide more information so we can help!

What do you mean?
The index does not perform a rollover after 1h.

What version are you on?
"version" : {
"number" : "2.3.4",
"lucene_version" : "5.5.0"

What do your logs say?
Nothing in there...

There is no rollover functionality in Elasticsearch 2.3.4, so I am not sure I understand what you are expecting to happen. If you are looking to create hourly indices, this is controlled through the ingestion pipeline, e.g. Logstash.

There is, however, a new rollover feature available in the upcoming Elasticsearch 5.0 release. If this is what you are referring to, you can read more about it in this blog post.
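For illustration, here is a minimal sketch of what the 5.x rollover API looks like (the index name logs-000001 and the alias logs_write are made-up examples, and none of this works on 2.3.4):

    # Create the first index with a write alias (hypothetical names).
    curl -XPUT 'http://HOST/logs-000001' -d '{
      "aliases": { "logs_write": {} }
    }'

    # Roll over to a new index once the current one is older than 1h.
    curl -XPOST 'http://HOST/logs_write/_rollover' -d '{
      "conditions": { "max_age": "1h" }
    }'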

I'm not sure...

The default logstash-* index performs a daily rollover...

What do you mean by "is controlled through the ingestion pipeline"?

The Elasticsearch output in Logstash can be configured to send events to an index name based on the event timestamp. When events for a new date/time period arrive, they are automatically written to a new index, which is then created in Elasticsearch. This is therefore managed through Logstash, not Elasticsearch.
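As a minimal sketch, assuming the host and index prefix below, an output block like this writes to a new index every hour:

    output {
      elasticsearch {
        hosts => ["HOST:9200"]
        # The index name is derived from the event timestamp,
        # so a new hour means a new index.
        index => "rollover-%{+YYYY.MM.dd.HH}"
      }
    }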

I've tried this option in the Logstash output config:

index => "index-%{+YYYY.MM.dd}"

But all my static mappings disappear...


The default Logstash index template only applies to indices named logstash-*, so you need to modify this or create your own if you change the index name. The mapping template for ES 2.x can be found here.
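As a sketch, assuming you save a copy of that template with its "template" field changed to match your new index name (e.g. in a file template.json), you would upload it like this:

    # "rollover" is a hypothetical template name.
    curl -XPUT 'http://HOST/_template/rollover' -d @template.json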


Using this template doesn't work...

syslog_filter:

    else if [program] == "elastic-wsa" {
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => [
          "message", "%{WORD:message_type}: %{NUMBER:log_time} %{NUMBER:duration} %{IP:client_address} %{WORD:transaction_result_code}/%{NUMBER:http_result_code} %{NUMBER:bytes:int} %{WORD:http_metod} %{PROTOCOL:url_protocol}%{DOMAIN:url_domain}%{REFERER:url_referer} %{NOTSPACE:user} %{NOTSPACE:requested_server} %{NOTSPACE:response_mime_type} %{NOT_HYPHEN:acl_decision_tag}-%{NOT_HYPHEN:access_or_decryption_policy}-%{NOT_HYPHEN:identity_policy_group}-%{NOT_HYPHEN:outbound_maleware_scanning_policy_group}-%{NOT_HYPHEN:data_security_policy_group}-%{NOT_HYPHEN:external_dlp_policy_group}-%{NOT_HYPHEN:routing_policy_group} <%{NOT_COMMA:url_category},%{NOT_COMMA:wbrs},%{NOT_COMMA:webroot_verdict},%{NOT_COMMA:spyname},%{NOT_COMMA:trr},%{NOT_COMMA:threat_id},%{NOT_COMMA:trace_id},%{NOT_COMMA:mcafee_verdict},%{NOT_COMMA:mcafee_filenmae},%{NOT_COMMA:mcafee_scan_error_code},%{NOT_COMMA:mcafee_detection_type},%{NOT_COMMA:mcafee_virus_type},%{NOT_COMMA:mcafee_virus_name},%{NOT_COMMA:sophos_verdict},%{NOT_COMMA:sophos_scan_return_code},%{NOT_COMMA:sophos_file_location},%{NOT_COMMA:sophos_threat_name},%{NOT_COMMA:data_security},%{NOT_COMMA:data_loss_prevention},%{NOT_COMMA:requested_side_url_verdict},%{NOT_COMMA:response_side_url_verdict},%{NOT_COMMA:unified_inbound_dvs_verdict},%{NOT_COMMA:web_reputation_filter_type},%{NOT_COMMA:avc_application_name},%{NOT_COMMA:avc_application_type},%{NOT_COMMA:avc_application_behavior},%{NOT_COMMA:avc_safe_browsing_scanning_verdict},%{NOT_COMMA:average_bandwidth},%{NOT_COMMA:throttle_flag},%{NOT_COMMA:type_of_user},%{NOT_COMMA:unified_outbound_dvs_verdict},%{NOT_COMMA:outbound_threat_name}%{GREEDYDATA:message_body}>"
        ]
      }

      mutate {
        #convert => [ "bytes", "integer" ]
        #convert => [ "duration", "integer" ]
        #convert => [ "client_address", "ip" ]
      }

      geoip {
        source =>   "client_address"
        target =>   "geoip"
        database => "/etc/logstash/conf.d/geo/GeoLiteCity.dat"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
      }

      mutate {
        convert => [ "[geoip][coordinates]", "float"]
      }
    }

Index:

    {
      "template" : "rollover-*",
      "settings" : {
        "index.refresh_interval" : "5s"
      },
      "mappings" : {
        "_default_" : {
          "_all" : {"enabled" : true, "omit_norms" : true},
          "dynamic_templates" : [ {
            "message_field" : {
              "match" : "message",
              "match_mapping_type" : "string",
              "mapping" : {
                "type" : "string", "index" : "analyzed", "omit_norms" : true,
                "fielddata" : { "format" : "disabled" }
              }
            }
          }, {
            "string_fields" : {
              "match" : "*",
              "match_mapping_type" : "string",
              "mapping" : {
                "type" : "string", "index" : "analyzed", "omit_norms" : true,
                "fielddata" : { "format" : "disabled" },
                "fields" : {
                  "raw" : {"type": "string", "index" : "not_analyzed", "doc_values" : true, "ignore_above" : 256}
                }
              }
            }
          }, {
            "float_fields" : {
              "match" : "*",
              "match_mapping_type" : "float",
              "mapping" : { "type" : "float", "doc_values" : true }
            }
          }, {
            "double_fields" : {
              "match" : "*",
              "match_mapping_type" : "double",
              "mapping" : { "type" : "double", "doc_values" : true }
            }
          }, {
            "byte_fields" : {
              "match" : "*",
              "match_mapping_type" : "byte",
              "mapping" : { "type" : "byte", "doc_values" : true }
            }
          }, {
            "short_fields" : {
              "match" : "*",
              "match_mapping_type" : "short",
              "mapping" : { "type" : "short", "doc_values" : true }
            }
          }, {
            "integer_fields" : {
              "match" : "*",
              "match_mapping_type" : "integer",
              "mapping" : { "type" : "integer", "doc_values" : true }
            }
          }, {
            "long_fields" : {
              "match" : "*",
              "match_mapping_type" : "long",
              "mapping" : { "type" : "long", "doc_values" : true }
            }
          }, {
            "date_fields" : {
              "match" : "*",
              "match_mapping_type" : "date",
              "mapping" : { "type" : "date", "doc_values" : true }
            }
          }, {
            "geo_point_fields" : {
              "match" : "*",
              "match_mapping_type" : "geo_point",
              "mapping" : { "type" : "geo_point", "doc_values" : true }
            }
          } ],
          "properties" : {
            "@timestamp": { "type": "date", "doc_values" : true },
            "@version": { "type": "string", "index": "not_analyzed", "doc_values" : true },
            "geoip"  : {
              "type" : "object",
              "dynamic": true,
              "properties" : {
                "ip": { "type": "ip", "doc_values" : true },
                "location" : { "type" : "geo_point", "doc_values" : true },
                "latitude" : { "type" : "float", "doc_values" : true },
                "longitude" : { "type" : "float", "doc_values" : true }
              }
            },
            "client_address": {
                 "type": "ip"
            }
          }
        }
      }
    }

client_address is a string now.

geoip.location is a float now.

Did you upload the index template? What does your output config look like?
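As a quick check (the index name below is hypothetical), you can compare what Elasticsearch actually stored against your template:

    # Mappings the index actually got.
    curl -XGET 'http://HOST/index-2016.08.01/_mapping?pretty'

    # Templates currently stored in the cluster.
    curl -XGET 'http://HOST/_template?pretty'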

I've configured a new index, not really a template:

curl -XPUT http://host/index

vs.

curl -XPUT http://host/_template/temp

After using the correct API, the template works fine...
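For the record, the stored template can be verified with a GET against the same endpoint:

    curl -XGET 'http://HOST/_template/temp?pretty'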