Greetings, and thank you in advance for your support!
Could you please help us with the syntax of our logstash.conf file, which parses an application log shipped via Filebeat? It worked on ES 6.6, but now that we are on ES 7.10 it generates errors like the following:
[2020-12-02T10:31:46,069][WARN ][logstash.outputs.elasticsearch][main][cfbfbf86e890aa06ae3b35658a4e19217e126ff790e6fea381e1fb891c32962b] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.12.337", :routing=>nil, :_type=>"%{[document_type]}"}, #<LogStash::Event:0x28bb48e0>], :response=>{"index"=>{"_index"=>"logstash-2020.12.337", "_type"=>"%{[document_type]}", "_id"=>"vqC5JHYBcn0aFO65gmiS", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [@version] cannot be changed from type [keyword] to [text]"}}}}
Could you please help us clean up the logstash.conf file below to resolve these errors?
Configuring Logstash
Under the /etc/logstash/conf.d directory we created a file logstash.conf with the following:
input {
  beats {
    port => 5044
  }
}

filter {
  if ([document_type] == "cloudian-request-info") {
    urldecode {
      field => "message"
    }
    csv {
      id => "cloudian-request-info"
      autogenerate_column_names => false
      separator => "|"
      columns => [
        "timestamp",
        "ipAddress",
        "bucketOwnerUserId",
        "operation",
        "bucketName",
        "contentAccessorUserID",
        "requestHeaderSize",
        "requestBodySize",
        "responseHeaderSize",
        "responseBodySize",
        "totalRequestResponseSize",
        "durationMsec",
        "objectName",
        "httpStatus",
        "s3RequestID",
        "eTag",
        "errorCode",
        "copySource"
      ]
      convert => {
        "requestHeaderSize"        => "integer"
        "requestBodySize"          => "integer"
        "responseHeaderSize"       => "integer"
        "responseBodySize"         => "integer"
        "totalRequestResponseSize" => "integer"
        "durationMsec"             => "integer"
      }
      remove_field => [ "message" ]
    }
    date {
      match => [ "timestamp", "ISO8601" ]
      remove_field => [ "timestamp" ]
    }
    geoip {
      source => "ipAddress"
    }
  }
}

output {
  if "_csvparsefailure" not in [tags] and "_dateparsefailure" not in [tags] {
    elasticsearch {
      hosts => [ "localhost:9200" ]
      document_type => "%{[document_type]}"
      index => "logstash-%{+YYYY.MM.DD}"
    }
  }
}
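From reading the 7.x breaking changes, we suspect two things are involved. First, Elasticsearch 7 removed mapping types, so the document_type option should probably be dropped entirely (the unresolved %{[document_type]} type in the error also suggests our events no longer carry that field, since newer Filebeat versions removed document_type). Second, in the index pattern DD is day-of-year, which explains the index name logstash-2020.12.337 (December 2 is day 337 of 2020); we presumably want dd for day-of-month. Would an output section like the following be the right direction?

```
output {
  if "_csvparsefailure" not in [tags] and "_dateparsefailure" not in [tags] {
    elasticsearch {
      hosts => [ "localhost:9200" ]
      # document_type removed: Elasticsearch 7 no longer supports custom mapping types
      index => "logstash-%{+YYYY.MM.dd}"  # dd = day of month; DD was day of year
    }
  }
}
```

If document_type really is gone from the events, we assume the if [document_type] == "cloudian-request-info" conditional in our filter would also never match, and we would need to tag these logs some other way (e.g. a custom field under fields: in filebeat.yml). Does that sound right?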
This is the log file format:
yyyy-mm-dd HH:mm:ss,SSS|IpAddressOfClientS3Server|S3RequestId|HttpStatus|HttpOperation|OriginalUri|HyperStoreFilePath|ContentLength|DurationMicrosecs|Etag
Log file example:
2016-10-27 15:18:18,031|10.20.2.52|6e4c6884-a4a2-1238-a908-525400c5e557|200|PUT|/file/CloudianTest1%2Fbuser1%2Ftest100b|/cloudian2/hsfs/1IjBeBudSCVmsYbKdPV8Ns/4a3ceb36ee344e1ebd43ed413b310bc8/046/075/56017837606746367338485930470043970723.1477552697400|100|854411|7b2a7abdfdaa1a01c33432b5c41e0939
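To double-check what the urldecode and csv steps should produce, we replayed them on the sample line with a small Python sketch (this is only our approximation of the two filters, not Logstash itself). It also shows the sample line has 10 fields while our csv filter declares 18 columns, so we assume the sample above is from a different Cloudian log than the request-info log our filter targets.

```python
from urllib.parse import unquote

# The sample log line from the Cloudian docs, joined onto one line.
line = ("2016-10-27 15:18:18,031|10.20.2.52|6e4c6884-a4a2-1238-a908-525400c5e557|"
        "200|PUT|/file/CloudianTest1%2Fbuser1%2Ftest100b|"
        "/cloudian2/hsfs/1IjBeBudSCVmsYbKdPV8Ns/4a3ceb36ee344e1ebd43ed413b310bc8/046/"
        "075/56017837606746367338485930470043970723.1477552697400|"
        "100|854411|7b2a7abdfdaa1a01c33432b5c41e0939")

# Mirror the pipeline: urldecode the whole message, then split on "|".
fields = unquote(line).split("|")

print(len(fields))              # 10 fields in this sample
print(fields[4], fields[5])     # PUT /file/CloudianTest1/buser1/test100b
```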
Direct link to document: https://s3.cloudianhyperstore.com/downloads/HyperStore/7/7.2.3/doc/HyperStoreHelp/HyperStoreHelp.html#Logging/LoggingAnalysis.html%3FTocPath%3D10.%20Logging|_____4
Thank you!