When I run Logstash against my configuration file, I get a mapper parsing error:
:response=>{"index"=>{"_index"=>"logstash-2016.06.07",
"_type"=>"txt", "_id"=>nil, "status"=>400,
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to
parse mapping [_default_]: Mapping definition for [data] has
unsupported parameters: [ignore_above : 1024]",
"caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"Mapping
definition for [data] has unsupported parameters: [ignore_above :
1024]"}}}}, :level=>:warn}
Grokking my logs works fine; I just don't know what is causing this mapping error.
Here is my logstash.conf:
input {
  stdin {}
  file {
    type => "txt"
    path => "C:\HA\accesslog\trial.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => ["%{IP:ClientAddr}%{SPACE}%{NOTSPACE:date}%{SPACE}%{TIME:time}%{SPACE}%{NOTSPACE:x-eap.wlsCustomLogField.VirtualHost}%{SPACE}%{WORD:cs-method}%{SPACE}%{PATH:cs-uri-stem}%{SPACE}%{PROG:x-eap.wlsCustomLogField.Protocol}%{SPACE}%{NUMBER:sc-status}%{SPACE}%{NUMBER:bytes}%{SPACE}%{NOTSPACE:x-eap.wlsCustomLogField.RequestedSessionId}%{SPACE}%{PROG:x-eap.wlsCustomLogField.Ecid}%{SPACE}%{NUMBER:x-eap.wlsCustomLogField.ThreadId}%{SPACE}%{NUMBER:x-eap.wlsCustomLogField.EndTs}%{SPACE}%{NUMBER:time-taken}"] }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
Please help. Thanks.
warkolm
(Mark Walkom)
June 8, 2016, 12:39am
That mapper_parsing_exception is why.
What does your mapping look like?
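For anyone reading along later: the mapping of the failing index (index name taken from the error above) can be inspected against a running Elasticsearch with the get-mapping API, e.g.:

```
curl -XGET 'http://localhost:9200/logstash-2016.06.07/_mapping?pretty'
```

If an index template is in play, `curl -XGET 'http://localhost:9200/_template?pretty'` shows where the `ignore_above` setting is coming from.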
A sample line of my data looks like this:
167.57.21.143 2016-05-26 00:00:12 cms-corp-aeis-prd.server.ab.org.au:32028 POST /aeis/7_0_0/aeis/timecapture.do HTTP/1.1 200 109 JOvopGXLSq4o-tWMV8Oj-6aDSILdcL3jd9bsyUiaI6a3vFO5o7dm!449867976!1464191968906 3eec121a91ac23b8:2be0a5e5:1533d1684e1:-7f72-00000000005fc371 4109 1464192012809 0.238
And when I run the file, Logstash does print the grok parse result, even though the mapping error occurs:
{
"message" => "167.57.21.143 2016-05-26 00:00:12 cms-corp-aeis-prd.server.ab.org.au:32028 POST /aeis/7_0_0/aeis/timecapture.do\tHTTP/1.1 200 109 JOvopGXLSq4o-tWMV8Oj-6aDSILdcL3jd9bsyUiaI6a3vFO5o7dm!449867976!1464191968906 3eec121a91ac23b8:2be0a5e5:1533d1684e1:-7f72-00000000005fc371 4109 1464192012809 0.238\r",
"@version" => "1",
"@timestamp" => "2016-06-07T17:03:20.752Z",
"path" => "C:\HA\accesslog\trial.log",
"host" => "WIN-07LLQEN2SJB",
"type" => "txt",
"ClientAddr" => "167.57.21.143",
"date" => "2016-05-26",
"time" => "00:00:12",
"x-eap.wlsCustomLogField.VirtualHost" => "cms-corp-aeis-prd.server.ab.org.au:32028",
"cs-method" => "POST",
"cs-uri-stem" => "/aeis/7_0_0/aeis/timecapture.do",
"x-eap.wlsCustomLogField.Protocol" => "HTTP/1.1",
"sc-status" => "200",
"bytes" => "109",
"x-eap.wlsCustomLogField.RequestedSessionId" => "JOvopGXLSq4o-tWMV8Oj-6aDSILdcL3jd9bsyUiaI6a3vFO5o7dm!449867976!1464191968906",
"x-eap.wlsCustomLogField.Ecid" => "3eec121a91ac23b8:2be0a5e5:1533d1684e1:-7f72-00000000005fc371",
"x-eap.wlsCustomLogField.ThreadId" => "4109",
"x-eap.wlsCustomLogField.EndTs" => "1464192012809",
"time-taken" => "0.238"
}
I eventually found the solution here: github.com/elastic/elasticsearch/issues/16283
Another problem was that the field names created for indexing were too long; shortening the names solved that issue.
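One way to shorten the field names without touching the grok pattern is Logstash's mutate filter; a minimal sketch (the short names here are my own choice, not anything prescribed):

```
filter {
  mutate {
    rename => {
      "x-eap.wlsCustomLogField.VirtualHost"        => "vhost"
      "x-eap.wlsCustomLogField.Protocol"           => "protocol"
      "x-eap.wlsCustomLogField.RequestedSessionId" => "session_id"
      "x-eap.wlsCustomLogField.Ecid"               => "ecid"
      "x-eap.wlsCustomLogField.ThreadId"           => "thread_id"
      "x-eap.wlsCustomLogField.EndTs"              => "end_ts"
    }
  }
}
```

This runs after the grok filter, so grok still captures into the long names and mutate renames them before the event is indexed.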