Error in ELK

I tried to import my logs using the kv filter, and they started importing like this:
"authserver" => "a_India RADIUS",
"proto" => "6",
"devname" => "FW_1",
"10:56:12\tdate" => "2020-06-22\tlocal7\tnotice\t\ttime=10:56:11",
"host" => "kali",
"dstintf" => "wan1",
"path" => "/root/Cybrotech-/log00",
"subtype" => "webfilter",
"srcintf" => "ssl.root",
"method" => "domain",
"eventtype" => "ftgd_allow",
"hostname" => "webmail.accessarellc.net",
"cat" => "33",
"srcintfrole" => "undefined",
"dstip" => "20.73.98.154",
"type" => "utm",
"sessionid" => "677535",
"dstintfrole" => "wan",
"srcport" => "6095",
"url" => "/",
"profile" => "monitor-all",
"srcip" => "10.212.134.190",
"logid" => "07013312",
"policyid" => "17",
"eventtime" => "12803571",
"direction" => "outgoing",
"level" => "notice",
"@version" => "1",
"reqtype" => "direct",
"catdesc" => ""Health",
"action" => "passthrough",
"vd" => "root",
"dstport" => "443",
"service" => "HTTPS",
"@timestamp" => 2020-07-13T09:10:47.811Z,
"sentbyte" => "192",
"devid" => "FG0TK19907000",
"group" => "SSLVPN_Group",
"msg" => "URL belongs to an allowed category in policy",
"user" => "\ASINGH",
"rcvdbyte" => "0"
}
After some time I got this error on screen:


"_type"=>"doc", "_id"=>"LzhxR3MBoH6QvDEw21Sy", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [log00_210270] has been exceeded"}}}}

Take a look at https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html#mapping-limit-settings

@warkolm okay, so that means it is not possible to import data with that many fields into Elasticsearch..

It's possible.

This topic might help you understand a bit more on what's happening - Approaches to deal with "Limit of total fields [1000] in index has been exceeded"?

@warkolm okay, thank you... so I am trying to increase my index's field limit through the index settings:

PUT my_index/_settings
{
  "index.mapping.total_fields.limit": 2000
}

as I don't get how to use dynamic mapping and the flattened data type, because I have a configuration file with only a kv filter in the filter section
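For reference, the flattened type is declared in the index mapping, not in Logstash. A minimal sketch (the field name `kv_fields` is hypothetical, reusing `my_index` from the snippet above):

```
PUT my_index/_mapping
{
  "properties": {
    "kv_fields": { "type": "flattened" }
  }
}
```

Combined with `target => "kv_fields"` in the kv filter, all the parsed key/value pairs land under one `kv_fields` object, which counts as a single field against the 1000-field limit.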

@warkolm ...this is the filter section of my configuration file for the above logs:
filter {
  kv {
    remove_field => ["message"]
  }
}
I am new to Elasticsearch and have no idea where to define the flattened data type... if you can help me through this I would be grateful. I had tried all the options and searched a lot about this error, but I can't find anything.

It seems that not all of your message is in KV format. This gives you a field name that contains a time (4th from the top), which rapidly increases the number of fields.
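To see the mechanism, here is a sketch (not the actual Logstash kv filter, just an emulation of `field_split => " "` / `value_split => "="`) run against the tab-separated prefix from the output above:

```python
# Emulate a naive key=value split on the raw line: split tokens on
# spaces, then take everything before the first "=" as the key.
line = ('10:56:12\tdate=2020-06-22\tlocal7\tnotice\t\ttime=10:56:11 '
        'devname="FW_1" proto=6')

fields = {}
for token in line.split(' '):
    if '=' in token:
        key, _, value = token.partition('=')
        fields[key] = value.strip('"')

# The first key comes out as '10:56:12\tdate' -- it embeds the log's
# own timestamp, so every log line mints a brand-new field name,
# which is what blows past the 1000-field index limit.
print(fields)
```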

@Christian_Dahlqvist I have my log like this:

172.16.3.254 Jun 22 11:00:40 date=2020-06-22 local7 notice time=11:00:39 devname="LR_FW_1" devid="FG20TK19907000" logid="00000013" type="traffic" subtype="forward" level="notice" vd="root" eventtime=1592803839 srcip=10.22.134.155 srcport=5596 srcintf="ssl.root" srcintfrole="undefined" dstip=10.10.11.20 dstport=53 dstintf="AHCP-MIBLR_ACT" dstintfrole="undefined" poluuid="f73d12a2-622c-51ea-c684-4ec26e252a5c" sessionid=39680340 proto=17 action="accept" user="MIA\abhishy" group="SSLVPN_Group" authserver="Mia_Ina RADIUS" policyid=18 policytype="policy" service="DNS" dstcountry="Reserved" srccountry="Reserved" trandisp="noop" duration=180 sentbyte=60 rcvdbyte=153 sentpkt=1 rcvdpkt=1 vpn="AHCP-MLR_ACT" vpntype="ipsec-static" appcat="unscanned"

Will you be able to help me write a filter section for this log? I have a file of about 7 GB.
@warkolm @Christian_Dahlqvist
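One common approach (an untested sketch, not a drop-in config; `%{IP}`, `%{SYSLOGTIMESTAMP}` and `%{GREEDYDATA}` are standard grok patterns) is to peel off the non-KV syslog prefix first, then run kv only on the remainder:

```
filter {
  grok {
    match => { "message" => "%{IP:client} %{SYSLOGTIMESTAMP:syslog_ts} %{GREEDYDATA:msgbody}" }
  }
  kv {
    source       => "msgbody"
    field_split  => " "
    value_split  => "="
    remove_field => ["message", "msgbody"]
  }
}
```

Since the prefix never reaches the kv filter, it can no longer generate timestamp-based field names; the kv filter also handles the quoted values (e.g. `devname="LR_FW_1"`) on its own.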

Did you look at the tutorial I linked to in your other thread?

@Christian_Dahlqvist yes, I tried to change my conf file and added a grok filter, but it is not working with the kv filter.

@Christian_Dahlqvist ...I have now changed my filter section to this:
filter {

  grok {
    match => { "message" => "%{IP:client}%{SYSLOGBASE}+%{GREEDYDATA:msgbody}" }
  }

  grok {
    match => {
      "msgbody" => ["%{GREEDYDATA:KV}\s"]
    }
  }

  kv {
    source            => "msgbody"
    field_split       => " "
    value_split       => "="
    remove_char_key   => "<>,"
    remove_char_value => "<>,"
    trim_key          => "<>,"
    trim_value        => "<>,"
    include_brackets  => false
    remove_field      => ["message"]
  }
}

I hope it now matches the logs... just check once.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.