Logstash with CloudFront

Hi,

I am facing an issue with the following code in my Logstash config file.

My other logs land in Elasticsearch perfectly, but events from this filter never make it into Elasticsearch.
If I run the following code on its own it works perfectly, but when I merge it with my existing config it does not work.

Could someone please help me sort out this issue? It is making my life hell.
The rest of the config has some ELB entries; if you want, I will share it with the entries masked.

//
filter {
  if [type] == "yayacloudfront" {
    grok {
      match => { "message" => "%{DATE:date}\t%{TIME:time}\t%{DATA:x_edge_location}\t(?:%{NUMBER:sc_bytes_int}|-)\t%{IPORHOST:ipactual}\t%{WORD:cs_method}\t%{HOSTNAME:cs_host}\t%{NOTSPACE:cs_uri_stem}\t%{NUMBER:elb-status_code}\t%{GREEDYDATA:referrer}\t%{GREEDYDATA:httpuseragent}\t%{GREEDYDATA:cs_uri_query}\t%{GREEDYDATA:cookies}\t%{WORD:x_edge_result_type}\t%{NOTSPACE:x_edge_request_id}\t%{HOSTNAME:domain_name}\t%{URIPROTO:httpprotocol}\t%{INT:cs_bytes_int}\t%{NUMBER:time_taken_float}\t%{GREEDYDATA:x_forwarded_for}\t%{GREEDYDATA:ssl_protocol}\t%{GREEDYDATA:ssl_cipher}\t%{GREEDYDATA:x_edge_response_result_type}\t%{GREEDYDATA:cs_protocol_version}\t%{GREEDYDATA:fle_status}\t%{GREEDYDATA:fle_encrypted_fields}" }
    }

    # build one timestamp field from the CloudFront date and time columns
    mutate {
      add_field => { "timestampnew" => "%{date} %{time}" }
    }
    date {
      match => [ "timestampnew", "YYYY-MM-dd'T'HH:mm:ssZ" ]
      locale => "en"
    }

    geoip {
      source => "ipactual"
      target => "geoip"
    }
    useragent { source => "httpuseragent" }
    # give the useragent fields clearer names
    mutate {
      rename => { "name" => "agentname" }
      rename => { "device" => "agentdevice" }
      rename => { "os_name" => "agentosname" }
    }

    mutate {
      remove_field => ["date", "time"]
    }
  }
}

output {
  elasticsearch {
    user => "elastic"
    hosts => ["localhost"]
    index => "all.com-%{+YYYY.MM.dd}"
    template_name => "yaya.com"
    template => "/etc/logstash/conf.d/elb.json"
    manage_template => true
    template_overwrite => true
  }
}

//
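For context, in the merged config this block sits next to the existing ELB conditional, roughly like the sketch below (the "elb" type name is just an illustration, since the real entries are masked):

filter {
  if [type] == "elb" {
    # existing ELB grok/filters (masked)
  }
  if [type] == "yayacloudfront" {
    # the CloudFront filters shown above
  }
}

This is the warning I get: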

[WARN ] 2019-09-02 23:40:38.031 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"yaya.com-2019.09.02", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x5d56ac68>], :response=>{"index"=>{"_index"=>"yaya.com-2019.09.02", "_type"=>"doc", "_id"=>"xxxxxxxxxx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestampnew] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"19-08-29 09:14:00\" is malformed at \"-08-29 09:14:00\""}}}}}

If your index template does not specify a format for the timestampnew field, then the default is

"strict_date_optional_time||epoch_millis"

A strict_date would start with a four-digit year, and epoch_millis would typically be a 13-digit number, so either way parsing fails when it hits the first hyphen.

Specify a format in your template.
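For example, something along these lines in the template file would do it. This is a minimal sketch: the index_patterns value and the doc mapping type are assumptions (adjust them to whatever your elb.json uses), and the format matches two-digit-year values like the "19-08-29 09:14:00" in your warning:

{
  "index_patterns": ["all.com-*"],
  "mappings": {
    "doc": {
      "properties": {
        "timestampnew": {
          "type": "date",
          "format": "yy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}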

Thanks @Badger

I tried removing

date {
  match => [ "timestampnew", "YYYY-MM-dd'T'HH:mm:ssZ" ]
  locale => "en"
}

and then everything works perfectly and the timestampnew field shows up in Kibana. But when I add the date filter back, it stops parsing again and nothing lands in Elasticsearch. I tried rubydebug, but it did not map my new timestampnew date into @timestamp.
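For reference, the rubydebug check was just a temporary stdout output next to the elasticsearch one, something like this sketch:

output {
  # print each event to the console to see what @timestamp ends up as
  stdout { codec => rubydebug }
}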

I have tried all of the following:

date {
  match => [ "timestampnew", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS" ]
  target => "@timestamp"
}

date {
  match => [ "timestampnew", "MMM dd HH:mm:ss", "MMM d HH:mm:ss" ]
  target => "@timestamp"
}

# 0019-09-04T08:25:14.000Z
date {
  match => [ "timestampnew", "yyyy-MM-dd'T'HH:mm:ss'.'SSSZ" ]
  target => "@timestamp"
}

but nothing worked.

You need to change your Elasticsearch configuration (the index template), not your Logstash configuration.

Thanks @Badger,

I sorted out the issue using this pattern:

date {
  match => [ "timestampnew", "yy-MM-dd'T'HH:mm:ss" ]
}

With no target set, the date filter writes the parsed value into @timestamp.
