I have a huge dictionary file to be used with the Logstash translate filter, around 180k entries. I tried reducing it to around 80k entries, but I still get the following error when starting the pipeline:
[2020-10-30T22:11:47,115][ERROR][logstash.agent ] Failed to execute action {:id=>:"dump-subsc", :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<dump-subsc>, action_result: false", :backtrace=>nil}
The translate filter in my Logstash pipeline is configured as below:
translate {
  field => "[imeiTac]"
  destination => "[deviceName]"
  dictionary_path => "/usr/share/logstash/pipeline/imei_tac.csv"
  fallback => "unknown device"
  refresh_interval => 0
}
Below is a sample of the dictionary entries:
"01124500","iPhone A1203"
"01130000","iPhone A1203"
"01130100","iPhone A1203"
"01136400","iPhone A1203"
"01136500","iPhone A1203"
"01143400","iPhone A1203"
"01147200","iPhone A1203"
"01154600","iPhone-A1203"
"01161200","iPhone 3G A1241"
"01161300","iPhone 3G A1241"
"01161400","iPhone 3G A1241"
"01165400","iPhone-A1203"
"01171200","iPhone 3G A1241"
"01171300","iPhone 3G A1241"
"01171400","iPhone 3G A1241"
"01174200","iPhone 3G A1241"
With only around 5k entries in the dictionary, the pipeline works fine and I get the expected result. However, when I add the full dictionary (around 80k entries), the pipeline does not start.
I have tried increasing the JVM heap from 1GB to 2GB and then 4GB, without success.
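For completeness, this is how I set the heap in Logstash's `config/jvm.options` (the exact path depends on the install; shown here with the 4GB value I tried last):

```
# config/jvm.options -- tried 1g, 2g, then 4g
-Xms4g
-Xmx4g
```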
According to the Logstash translate filter documentation, the filter has been tested with around 100k dictionary entries.
How can I make my pipeline work with such a huge dictionary?
Thanks,