Translate plugin breaking my pipelines

After updating to 8.16 (and today to 8.17 to see if the issue would be fixed), all pipelines that use the translate filter plugin started breaking

[2024-12-17T15:38:27,060][ERROR][logstash.agent ] Failed to execute action {:id=>:frontend, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<frontend>, action_result: false", :backtrace=>nil}

Even enabling debug on Logstash didn't give any useful information. The only way I know this plugin was the cause is that I noticed all pipelines that didn't use it were working, so I removed the translate configuration and all pipelines started working again

logstash-filter-translate (3.4.2)
ii logstash 1:8.17.0-1 amd64 An extensible logging pipeline

Does anyone have any idea whether I am right and this is indeed an issue with the plugin, and how I can circumvent it? I really need it to make some of my log information user-friendly (for now I am using a workaround that is not viable in the long run...)

If more information is needed, please let me know, but I really need some help here :frowning:

Hello and welcome,

After updating to 8.16 (and today to 8.17 to see if the issue would be fixed)

From which version did you upgrade?

You should have other log lines close to this one with more context about what the issue was.

Can you share more lines and also share one of your pipelines?

Hello @leandrojmp , thank you very much for your answer!

From which version did you upgrade?

8.15

You should have other log lines close to this one with more context about what the issue was.

Here is part of the logs from when I turned on debug (it generated a lot of lines, so I am pasting the ones closest to the error). Let me know if you need more information.

[2024-12-17T14:18:38,099][INFO ][logstash.javapipeline    ] Pipeline `as` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T14:18:38,102][INFO ][logstash.javapipeline    ] Pipeline `frontend` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T14:18:38,103][INFO ][logstash.javapipeline    ] Pipeline `zabbix` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T14:18:38,105][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,106][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,106][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,107][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,107][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,108][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :flow]`
[2024-12-17T14:18:38,109][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,109][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,110][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :zabbix, :plugins, :inputs, :"0b9e0b0bd3ac93d6cd2a380ff5d3865463a9a56302d1157db6de7ad6fc14522a", :flow]`
[2024-12-17T14:18:38,110][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"71fad01016a9273cd72416418aede6fd4c221c0f618980d2e0a9c6b9dc079338", :flow]`
[2024-12-17T14:18:38,111][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"71fad01016a9273cd72416418aede6fd4c221c0f618980d2e0a9c6b9dc079338", :flow]`
[2024-12-17T14:18:38,111][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"8ed2dca6b4818339632f6ced8e8873771827471051b8523cebb3b668aba022fb", :flow]`
[2024-12-17T14:18:38,112][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"8ed2dca6b4818339632f6ced8e8873771827471051b8523cebb3b668aba022fb", :flow]`
[2024-12-17T14:18:38,112][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :f8a5b09ad5940d03acf7621bc0bab465febff8be64d407f9daf5c81371fb488e, :flow]`
[2024-12-17T14:18:38,113][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :f8a5b09ad5940d03acf7621bc0bab465febff8be64d407f9daf5c81371fb488e, :flow]`
[2024-12-17T14:18:38,113][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :f0cbdbacf03f2442bd6150f6210412485349533efc27cf327110154dd6529242, :flow]`
[2024-12-17T14:18:38,114][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,114][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,114][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :f0cbdbacf03f2442bd6150f6210412485349533efc27cf327110154dd6529242, :flow]`
[2024-12-17T14:18:38,115][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"5b0d1f7b4879f57be109a65f5810304c3d8452c0bf244ff50d1fd1faa606ffee", :flow]`
[2024-12-17T14:18:38,115][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"5b0d1f7b4879f57be109a65f5810304c3d8452c0bf244ff50d1fd1faa606ffee", :flow]`
[2024-12-17T14:18:38,116][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"920445bc2788fad791552d44248458b91d7248ffc7750ae5c5306b68239da313", :flow]`
[2024-12-17T14:18:38,116][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"920445bc2788fad791552d44248458b91d7248ffc7750ae5c5306b68239da313", :flow]`
[2024-12-17T14:18:38,117][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,117][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"32a35fffea719d6b935d06bd6cb38847920aee952d06e97b5c6c417f666a188e", :flow]`
[2024-12-17T14:18:38,117][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,118][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,118][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :filters, :"32a35fffea719d6b935d06bd6cb38847920aee952d06e97b5c6c417f666a188e", :flow]`
[2024-12-17T14:18:38,118][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,118][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :flow]`
[2024-12-17T14:18:38,119][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :as, :plugins, :inputs, :f3cdceba9c18ced4cbd06ffe393b095411e9b29e0836c0c7663a80ca0dbcfd5d, :flow]`
[2024-12-17T14:18:38,120][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :zabbix, :plugins, :outputs, :a75c5e722a93a192fbc3a80182efc7e23b3594d60c94f796d8eb5838abf0ef00, :flow]`
[2024-12-17T14:18:38,120][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,120][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"6be1005fd33ac7da5edf530a8721223a5957f4dad5ac371326a31e299b2c263e", :flow]`
[2024-12-17T14:18:38,120][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,121][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :zabbix, :plugins, :outputs, :a75c5e722a93a192fbc3a80182efc7e23b3594d60c94f796d8eb5838abf0ef00, :flow]`
[2024-12-17T14:18:38,121][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"6be1005fd33ac7da5edf530a8721223a5957f4dad5ac371326a31e299b2c263e", :flow]`
[2024-12-17T14:18:38,121][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"zabbix"}
[2024-12-17T14:18:38,121][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"86f7ef85a8c505a7feaf2efaf8fc32b479e58bcf5ba2596a3f3a044db6a67fb2", :flow]`
[2024-12-17T14:18:38,122][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"86f7ef85a8c505a7feaf2efaf8fc32b479e58bcf5ba2596a3f3a044db6a67fb2", :flow]`
[2024-12-17T14:18:38,122][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"13fe7081df52ec9994969b42d9bdd49f05394f9904b1d14b1a2c71c9d38d9e58", :flow]`
[2024-12-17T14:18:38,123][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"13fe7081df52ec9994969b42d9bdd49f05394f9904b1d14b1a2c71c9d38d9e58", :flow]`
[2024-12-17T14:18:38,124][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"793791a98e62acb004502b7d4a350c3a1fbf9364842c0d997eca12b41230cb56", :flow]`
[2024-12-17T14:18:38,124][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"793791a98e62acb004502b7d4a350c3a1fbf9364842c0d997eca12b41230cb56", :flow]`
[2024-12-17T14:18:38,125][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"0416e2df3b5b0b885955d27aa613fc39ac9c161ac705424a7b43f7109825c73a", :flow]`
[2024-12-17T14:18:38,125][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"0416e2df3b5b0b885955d27aa613fc39ac9c161ac705424a7b43f7109825c73a", :flow]`
[2024-12-17T14:18:38,126][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :ff1c2dd1002e59b4d84a9984c4caccef9d36600313fffa0263a2c533c6501971, :flow]`
[2024-12-17T14:18:38,126][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :ff1c2dd1002e59b4d84a9984c4caccef9d36600313fffa0263a2c533c6501971, :flow]`
[2024-12-17T14:18:38,127][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :flow]`
[2024-12-17T14:18:38,132][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :c1e382f860aa1912477d306536ecfbebe0a01d554db1e4b406829776bd24fce5, :flow]`
[2024-12-17T14:18:38,133][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :frontend, :plugins, :inputs, :"7d01d9cafbe913afaff3bd7c976bb36744bfbe4fd2f2ad05acf78ebd04e62065", :flow]`
[2024-12-17T14:18:38,134][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :c1e382f860aa1912477d306536ecfbebe0a01d554db1e4b406829776bd24fce5, :flow]`
[2024-12-17T14:18:38,135][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :ff6dd75e5d378a40759ed811a4d5c889fa856511e20b236af9cf0f8b10d005a6, :flow]`
[2024-12-17T14:18:38,136][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"13b993dad40c9caa27e54aca1198e469ec2ac399bb689d597e6d893d1498adf6", :flow]`
[2024-12-17T14:18:38,137][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"13b993dad40c9caa27e54aca1198e469ec2ac399bb689d597e6d893d1498adf6", :flow]`
[2024-12-17T14:18:38,138][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"41dd3c63fed9b098f7c2a8bcdcb1b452e3735a816ef9b9e3d31ef23a7b364ac8", :flow]`
[2024-12-17T14:18:38,138][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :"41dd3c63fed9b098f7c2a8bcdcb1b452e3735a816ef9b9e3d31ef23a7b364ac8", :flow]`
[2024-12-17T14:18:38,136][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :ff6dd75e5d378a40759ed811a4d5c889fa856511e20b236af9cf0f8b10d005a6, :flow]`
[2024-12-17T14:18:38,139][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :bb2882250bbd9cc0b1a0d3a2da0c806cfa7206a575dab1768fe9f1b75727669f, :flow]`
[2024-12-17T14:18:38,141][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :c710ca26c7660c609ca8d46345d40856200387f0e74f4b48d2b2452c57783007, :flow]`
[2024-12-17T14:18:38,141][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :filters, :c710ca26c7660c609ca8d46345d40856200387f0e74f4b48d2b2452c57783007, :flow]`
[2024-12-17T14:18:38,141][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :as, :plugins, :outputs, :fa1993f75b6a7e284aa968562976e73750589ce586ab15d1be37ef7ed5b79db1, :flow]`
[2024-12-17T14:18:38,142][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :bb2882250bbd9cc0b1a0d3a2da0c806cfa7206a575dab1768fe9f1b75727669f, :flow]`
[2024-12-17T14:18:38,147][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :as, :plugins, :outputs, :fa1993f75b6a7e284aa968562976e73750589ce586ab15d1be37ef7ed5b79db1, :flow]`
[2024-12-17T14:18:38,152][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"4fc395034e01f08a2f709d5f22aed45f91d38c52cf60fec37cf1db842f2f7d75", :flow]`
[2024-12-17T14:18:38,154][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"as"}
[2024-12-17T14:18:38,154][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"4fc395034e01f08a2f709d5f22aed45f91d38c52cf60fec37cf1db842f2f7d75", :flow]`
[2024-12-17T14:18:38,155][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"5aecdccda8a932305939895ed63849d9e18f48737452490b487988c5d9df5bb7", :flow]`
[2024-12-17T14:18:38,156][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"5aecdccda8a932305939895ed63849d9e18f48737452490b487988c5d9df5bb7", :flow]`
[2024-12-17T14:18:38,156][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"6b8620ee95ff96fc74ee073bfa28535a88187b289844b30cc2c333a6f15c5660", :flow]`
[2024-12-17T14:18:38,157][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"6b8620ee95ff96fc74ee073bfa28535a88187b289844b30cc2c333a6f15c5660", :flow]`
[2024-12-17T14:18:38,158][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"9f8e8c69cb9b460f1042a3a55e55f588e11c7a8d1f295dba6801347a2c43d7eb", :flow]`
[2024-12-17T14:18:38,158][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"9f8e8c69cb9b460f1042a3a55e55f588e11c7a8d1f295dba6801347a2c43d7eb", :flow]`
[2024-12-17T14:18:38,159][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"70a1e5968d7b74097334aa311bf2d2c9466ced29405c613735614c573f8d34b8", :flow]`
[2024-12-17T14:18:38,159][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :"70a1e5968d7b74097334aa311bf2d2c9466ced29405c613735614c573f8d34b8", :flow]`
[2024-12-17T14:18:38,160][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :bca3b140291994bed57ee21cb92adf5883254e9a51a1318976d9462555ea48e6, :flow]`
[2024-12-17T14:18:38,162][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :bca3b140291994bed57ee21cb92adf5883254e9a51a1318976d9462555ea48e6, :flow]`
[2024-12-17T14:18:38,164][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :f2e5b5935d874c877c1addc1a5bfea9d9df10e829507a7cc94eaa2b1e72573f0, :flow]`
[2024-12-17T14:18:38,164][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :f2e5b5935d874c877c1addc1a5bfea9d9df10e829507a7cc94eaa2b1e72573f0, :flow]`
[2024-12-17T14:18:38,165][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :c139b7bb0274cdd7e5efd9ba510bee676214a3cf5fdce42ab9e4302998c5a829, :flow]`
[2024-12-17T14:18:38,166][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :filters, :c139b7bb0274cdd7e5efd9ba510bee676214a3cf5fdce42ab9e4302998c5a829, :flow]`
[2024-12-17T14:18:38,166][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :frontend, :plugins, :outputs, :b298ba3a6dd335a899b678a379ec822dd89a2d379bd4c5470213e8cc7d08f445, :flow]`
[2024-12-17T14:18:38,167][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :frontend, :plugins, :outputs, :b298ba3a6dd335a899b678a379ec822dd89a2d379bd4c5470213e8cc7d08f445, :flow]`
[2024-12-17T14:18:38,167][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"frontend"}
[2024-12-17T14:18:38,559][ERROR][logstash.agent           ] Failed to execute action {:id=>:frontend, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<frontend>, action_result: false", :backtrace=>nil}
[2024-12-17T14:18:38,560][ERROR][logstash.agent           ] Failed to execute action {:id=>:as, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<as>, action_result: false", :backtrace=>nil}
[2024-12-17T14:18:39,216][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"zabbix", :thread=>"#<Thread:0x67dcb70a /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:139 run>"}

Can you share more lines and also share one of your pipelines?

Config from one of the pipelines that uses this plugin (translate configuration commented out so I could start the pipeline again):

input {
        tcp {
                port => 4301
                codec => multiline {
                        pattern => "^\{ "
                        negate => true
                        what => "previous"
                }
                ssl_client_authentication => none
                tcp_keep_alive => true
        }
}

filter {
        mutate {
                gsub => [
                        "message", "\r", "\\r",
                        "message", "\n", "",
                        "message", "/", "\/"
                ]
        }

        split {
                field => "message"
        }

        if [message] =~ /^\{.*\}$/ {
                json {
                        source => "message"
                        target => "json"
                        remove_field => ["message"]  # Optionally remove the original raw JSON field
                }

                mutate {
                        rename => {
                                "[json][TimeStamp]" => "time"
                                "[json][level]" => "log_level"
                                "[json][machine]" => "machine"
                                "[json][Message]" => "message"
                                "[json][Exception]" => "exception"
                                "[json][Stacktrace]" => "stacktrace"
                                "[json][Thread]" => "thread_id"
                                "[json][ExceptionType]" => "exception_type"
                                "[json][WEBSITE_SITE_NAME]" => "site_name"
                                "[json][ClientId]" => "client_id"
                                "[json][EnvironmentId]" => "environment_id"
                        }
                }

#               translate {
#                       source => "environment_id"
#                       target => "environment"
#                       dictionary => {
#                               "0" => "Production"
#                               "1" => "Test"
#                               "2" => "UAT"
#                               "4" => "Dev"
#                       }
#                       fallback => "N/A"
#               }
#
#               translate {
#                       source => "client_id"
#                       target => "client"
#                       dictionary_path => "/opt/logstash/clientid.json"
#                       fallback => "N/A"
#               }


        } else {
                mutate {
                        add_field => { "type" => "plaintext_log" }
                }

                grok {
                        match => { "message" => "%{GREEDYDATA:raw_message}" }
                        overwrite => ["message"]
                }
        }


        mutate {
                remove_field => ["sort","[event][original]","@version","tags"]
        }
}

output {
        elasticsearch {
                hosts => ["<ELASTICSEARCH SERVER>"]
                ssl_verification_mode => "none"
                user => "<ELASTICSEARCH USER>"
                password => "<PASSWORD USER>"
                index => "frontend-logs-%{+YYYY.MM.dd}"
        }

}
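
As an aside, one way to confirm the translate filter itself is the trigger would be a minimal throwaway pipeline that contains nothing else. This is just a sketch: stdin/stdout are only there for testing, and the dictionary is copied from the commented block above:

input { stdin { } }

filter {
        translate {
                source => "message"
                target => "environment"
                dictionary => {
                        "0" => "Production"
                        "1" => "Test"
                        "2" => "UAT"
                        "4" => "Dev"
                }
                fallback => "N/A"
        }
}

output { stdout { codec => rubydebug } }

Running it with bin/logstash -f test.conf and typing a value like 0 should either reproduce the Create failure or rule the filter out.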


There is nothing useful in those logs.

Could you disable DEBUG, start it again to generate the errors, and share the logs from start to end?
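
You could also run the configuration through Logstash's config checker; it sometimes surfaces a plugin or parse error that the agent log swallows. The paths below are the Debian package defaults and the file name is just an example; point it at your actual pipeline file:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/frontend.conf --config.test_and_exit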

I could not replicate it; I uncommented the first translate and your pipeline ran normally on 8.17.

Hey there!
Sorry it took so long to answer...

Could you disable DEBUG, start it again to generate the errors, and share the logs from start to end?

[2024-12-17T13:52:48,477][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-12-17T13:52:48,484][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.16.1", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [x86_64-linux]"}
[2024-12-17T13:52:48,486][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms4g, -Xmx4g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-12-17T13:52:48,488][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-12-17T13:52:48,488][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-12-17T13:52:49,075][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-12-17T13:52:50,102][INFO ][org.reflections.Reflections] Reflections took 105 ms to scan 1 urls, producing 149 keys and 523 values
[2024-12-17T13:52:50,957][INFO ][logstash.javapipeline    ] Pipeline `frontend` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:52:50,958][INFO ][logstash.javapipeline    ] Pipeline `as` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:52:50,999][INFO ][logstash.javapipeline    ] Pipeline `zabbix` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:52:51,291][ERROR][logstash.agent           ] Failed to execute action {:id=>:frontend, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<frontend>, action_result: false", :backtrace=>nil}
[2024-12-17T13:52:51,291][ERROR][logstash.agent           ] Failed to execute action {:id=>:as, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<as>, action_result: false", :backtrace=>nil}
[2024-12-17T13:53:57,834][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
[2024-12-17T13:54:02,856][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:zabbix}
[2024-12-17T13:54:02,862][INFO ][logstash.runner          ] Logstash shut down.
[2024-12-17T13:54:12,709][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-12-17T13:54:12,715][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.16.1", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [x86_64-linux]"}
[2024-12-17T13:54:12,717][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms4g, -Xmx4g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-12-17T13:54:12,718][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-12-17T13:54:12,719][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-12-17T13:54:13,241][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-12-17T13:54:14,178][INFO ][org.reflections.Reflections] Reflections took 116 ms to scan 1 urls, producing 149 keys and 523 values
[2024-12-17T13:54:15,023][INFO ][logstash.javapipeline    ] Pipeline `frontend` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:54:15,027][INFO ][logstash.javapipeline    ] Pipeline `as` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:54:15,063][INFO ][logstash.javapipeline    ] Pipeline `zabbix` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-17T13:54:15,366][ERROR][logstash.agent           ] Failed to execute action {:id=>:frontend, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<frontend>, action_result: false", :backtrace=>nil}
[2024-12-17T13:54:15,393][ERROR][logstash.agent           ] Failed to execute action {:id=>:as, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<as>, action_result: false", :backtrace=>nil}

I don't have logs without debug after the update, and as this is a production environment, I cannot re-enable the translate plugin for now...

Maybe it is the version of one of the libs I use?
I am on Debian 12; could that be a factor?

Thank you for the help

I don't think so; Logstash is a Java application and uses a bundled Java.

Are you using another Java? But even if you were, I am not sure what could cause this.

As mentioned, I could not replicate it; your pipeline starts fine for me.

I think you should open a bug report on GitHub, but I cannot think of what could cause this; there has been no recent change in the translate plugin, as the last release is from June 2023.
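
Before filing, it might also be worth confirming which version of the plugin your installation actually resolved, and trying a plugin update. Assuming the standard package layout:

/usr/share/logstash/bin/logstash-plugin list --verbose logstash-filter-translate
/usr/share/logstash/bin/logstash-plugin update logstash-filter-translate

If the listed version differs from the one bundled with 8.17, that would be useful information to include in the bug report.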