Index creation failed

Hi Experts,

I am new to Elasticsearch and I'm hitting an error while creating an index.
My pipeline is filebeat --> logstash --> elasticsearch --> kibana.

I am trying to read SOA (WebLogic) server logs; below is my Logstash config file.

Input:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

Filter:

filter {
  if [message] !~ /^####/ {
    drop { }
  }

  grok {
    match => { "message" => "\#\#\#\#\<%{DATA:msg_timestamp}\> \<%{DATA:msg_severity}\> \<%{DATA:msg_subsystem}\>%{GREEDYDATA:msg_details}" }
  }

  date {
    match => [ "msg_timestamp", "MMM dd yyyy" ]
  }
}
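For context, the lines I want to keep start with #### and look roughly like this (the timestamp is copied from a real event; the remaining fields are my rough sketch of the WebLogic log format, so the exact field order may differ):

####<Mar 16, 2018 12:59:33 AM CDT> <Info> <WorkManager> ...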

Output:

output {
  elasticsearch {
    hosts => ["xx.xxx.xx.xxx:9200"]
    sniffing => true
    manage_template => false
    index => "te_%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

Index Creation:

PUT _template/template_1
{
  "index_patterns": ["te*", "bar*"],
  "settings": {
    "number_of_shards": 1
  },
  "msg_timestamp": {
    "type": "date",
    "format": "MMM dd yyyy"
  }
}
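Reading the template docs, I suspect my field mapping needs to sit under mappings and properties instead of at the top level of the template. Something like this is what I was aiming for (untested; I used the type name "doc" because that is the _type shown in the error below):

PUT _template/template_1
{
  "index_patterns": ["te*", "bar*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "msg_timestamp": {
          "type": "date",
          "format": "MMM dd yyyy"
        }
      }
    }
  }
}

Is that the right structure?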

With the above configuration I get the error below:

[2018-12-06T16:01:34,626][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"te_filebeat-2018.12.06", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x6341c77b], :response=>{"index"=>{"_index"=>"te_filebeat-2018.12.06", "_type"=>"doc", "_id"=>"ASwSg2cBLCbRLhh1nfvO", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [msg_timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "Mar 16, 2018 12:59:33 AM CDT""}}}}}
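From the error it looks like the actual timestamp value is "Mar 16, 2018 12:59:33 AM CDT", which doesn't match my "MMM dd yyyy" pattern at all (no comma, no time, no zone). I am guessing the date filter needs a fuller pattern, something like this (untested; I am not sure the zone name "CDT" or a single-digit day would parse with this, so treat it as a sketch):

date {
  match => [ "msg_timestamp", "MMM dd, yyyy hh:mm:ss a z" ]
}

And presumably the template's date format would need to match the same pattern.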

What is the best way to create the index?
