Logstash 8.12.2 - CSV Input to HTTP Output - Pipeline stuck looping

I have a Logstash instance I am trying to troubleshoot. When I launch Logstash in debug mode, it looks like it is looping on "no cgroup found". I've looked up and followed the advice about changing the sincedb path, and the pipeline works fine when it only outputs to rubydebug. I have also tested the endpoint with non-CSV input and Logstash operates fine there. Does anyone have a recommended next troubleshooting step?

Config:

input {
    file {
        path => "/Users/{obfuscated}/test_data/{obfuscated}3.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter { 
  csv {
    autodetect_column_names => true
  }

  mutate {
    rename => { "Alert name" => "title" }
    rename => { "Tags" => "tags" }
    rename => { "Severity" => "severity" }
    rename => { "Investigation state" => "investigationState" }
    rename => { "Status" => "status" }
    rename => { "Category" => "category" }
    rename => { "Detection source" => "detectionSource" }
    rename => { "Impacted assets" => "impactedAssets" }
    rename => { "First activity" => "firstActivity" }
    rename => { "Last activity" => "lastActivity" }
    rename => { "Classification" => "classification" }
    rename => { "Determination" => "determination" }
    rename => { "Assignee" => "assignedTo" }

    split => ["impactedAssets", ","]

    copy => { "[impactedAssets][0]" => "deviceName" }
    copy => { "[impactedAssets][1]" => "userName" }

    remove_field => ["message", "[event][original]"]
  }

  if ![userName] {
    drop {}
  }

  mutate {
    gsub => ["userName", "Accounts: ", ""]
    gsub => ["deviceName", "Devices: ", ""]
  }
}

output {
  stdout {
  codec => rubydebug
  }
  http { 
  id => "_push_MDEAlerts" 
  url => "{API URL}" 
  http_method => "post" 
  headers => { 
  "x-api-key" => "token obfuscated" 
  "content-type" => "application/json"
}
 format => "json_batch" # aka JSON array/list 
 request_timeout => 600
 retry_failed => false 
 pool_max => 1
 }
}

The debug output loops over the repeated log messages below, which contain no errors. When I try --verbose, no logs are produced at all.

[2025-03-24T14:39:45,382][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-24T14:39:45,396][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-24T14:39:45,413][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-24T14:39:45,414][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-24T14:39:45,414][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-24T14:39:45,415][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-24T14:39:45,416][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :main, :plugins, :inputs, :deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3, :flow]`
[2025-03-24T14:39:45,416][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :db912b0d186f2959c531eac4eeebd96036b943d0b5aaa0caa0bfb24f01a27ffc, :flow]`
[2025-03-24T14:39:45,417][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :db912b0d186f2959c531eac4eeebd96036b943d0b5aaa0caa0bfb24f01a27ffc, :flow]`
[2025-03-24T14:39:45,417][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :ecacf32c08cd6eab2af2c5b8a1edb8854ae212cb461a16db823539c99fc885ab, :flow]`
[2025-03-24T14:39:45,417][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :ecacf32c08cd6eab2af2c5b8a1edb8854ae212cb461a16db823539c99fc885ab, :flow]`
[2025-03-24T14:39:45,418][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :"61be544029013ac21a4ac9d596e11c3f1e3fdd9f04b1a3fff48d4d2564ba67e7", :flow]`
[2025-03-24T14:39:45,418][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :"61be544029013ac21a4ac9d596e11c3f1e3fdd9f04b1a3fff48d4d2564ba67e7", :flow]`
[2025-03-24T14:39:45,418][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :"162595caa20edb10d2625ff905bf3fbcc418ac7cc355c355796a4d9839aa6216", :flow]`
[2025-03-24T14:39:45,419][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :"162595caa20edb10d2625ff905bf3fbcc418ac7cc355c355796a4d9839aa6216", :flow]`
[2025-03-24T14:39:45,419][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :"4596faeb33c148caeda685457fb89fcdb50b0a5aac0a2a789df6b235095dc60c", :flow]`
[2025-03-24T14:39:45,419][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :"4596faeb33c148caeda685457fb89fcdb50b0a5aac0a2a789df6b235095dc60c", :flow]`
[2025-03-24T14:39:45,420][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :unify_events_push_MDEAlerts, :flow]`
[2025-03-24T14:39:45,420][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :unify_events_push_MDEAlerts, :flow]`
[2025-03-24T14:39:45,420][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2025-03-24T14:39:45,424][DEBUG][logstash.filters.csv     ][main] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2025-03-24T14:39:45,425][INFO ][logstash.filters.csv     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-24T14:39:45,438][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/Users/steventunink/Desktop/logstash-8.12.2/config/csvTest.conf"], :thread=>"#<Thread:0x1d8e36e /Users/steventunink/Desktop/logstash-8.12.2/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2025-03-24T14:39:46,250][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.81}
[2025-03-24T14:39:48,851][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:39:48,896][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:39:48,896][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:39:51,260][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-24T14:39:51,264][INFO ][filewatch.observingtail  ][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] START, creating Discoverer, Watch with file and sincedb collections
[2025-03-24T14:39:51,266][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:39:51,267][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] open: reading from NUL
[2025-03-24T14:39:51,267][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1d8e36e /Users/steventunink/Desktop/logstash-8.12.2/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2025-03-24T14:39:51,274][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-03-24T14:39:52,280][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 1742845192)
[2025-03-24T14:39:53,859][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:39:53,905][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:39:53,905][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:39:56,270][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:39:58,863][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:39:58,912][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:39:58,913][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:01,266][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:03,871][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:03,922][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:03,922][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:06,270][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:07,352][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 15)
[2025-03-24T14:40:08,877][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:08,928][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:08,928][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:11,269][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:13,892][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:13,962][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:13,962][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:16,269][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:18,898][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:18,970][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:18,970][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:21,267][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:22,412][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 15)
[2025-03-24T14:40:23,902][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:23,941][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2015870025} forced-compaction result (captures: `3` span: `PT10.008008248S`)
[2025-03-24T14:40:23,942][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=696714807} forced-compaction result (captures: `3` span: `PT10.01035701S`)
[2025-03-24T14:40:23,942][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=562261766} forced-compaction result (captures: `3` span: `PT10.010495796S`)
[2025-03-24T14:40:23,942][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=998630254} forced-compaction result (captures: `3` span: `PT10.010637493S`)
[2025-03-24T14:40:23,942][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=348940971} forced-compaction result (captures: `3` span: `PT10.010852081S`)
[2025-03-24T14:40:23,981][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:23,981][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:26,267][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:28,906][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:28,947][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=892581988} forced-compaction result (captures: `3` span: `PT10.010195082S`)
[2025-03-24T14:40:28,947][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=180847658} forced-compaction result (captures: `3` span: `PT10.010480595S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=104314797} forced-compaction result (captures: `3` span: `PT10.010598191S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1842664733} forced-compaction result (captures: `3` span: `PT10.010701591S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1158459483} forced-compaction result (captures: `3` span: `PT10.01079701S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1943833092} forced-compaction result (captures: `3` span: `PT10.010893769S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1223545101} forced-compaction result (captures: `3` span: `PT10.010986472S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=56265008} forced-compaction result (captures: `3` span: `PT10.011080058S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=603221864} forced-compaction result (captures: `3` span: `PT10.011176867S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=888098232} forced-compaction result (captures: `3` span: `PT10.011273651S`)
[2025-03-24T14:40:28,948][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=304908177} forced-compaction result (captures: `3` span: `PT10.01136663S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1172873197} forced-compaction result (captures: `3` span: `PT10.01146527S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1172776727} forced-compaction result (captures: `3` span: `PT10.011586179S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=195764382} forced-compaction result (captures: `3` span: `PT10.011690007S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2110598773} forced-compaction result (captures: `3` span: `PT10.011783694S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2086392278} forced-compaction result (captures: `3` span: `PT10.011883186S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=16293589} forced-compaction result (captures: `3` span: `PT10.011980253S`)
[2025-03-24T14:40:28,949][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1384505732} forced-compaction result (captures: `3` span: `PT10.012072112S`)
[2025-03-24T14:40:28,992][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:28,993][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:31,268][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:33,910][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:34,006][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:34,007][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:36,269][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:37,472][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 15)
[2025-03-24T14:40:38,922][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:39,024][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:39,024][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:41,267][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:43,931][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:44,034][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:44,035][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:46,271][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:48,936][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:49,044][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:49,044][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:51,271][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:52,554][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 15)
[2025-03-24T14:40:53,942][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:53,981][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2015870025} forced-compaction result (captures: `3` span: `PT10.007002298S`)
[2025-03-24T14:40:53,981][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=696714807} forced-compaction result (captures: `3` span: `PT10.0072222S`)
[2025-03-24T14:40:53,981][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=562261766} forced-compaction result (captures: `3` span: `PT10.007339053S`)
[2025-03-24T14:40:53,981][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=998630254} forced-compaction result (captures: `3` span: `PT10.007431042S`)
[2025-03-24T14:40:53,982][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=348940971} forced-compaction result (captures: `3` span: `PT10.007531342S`)
[2025-03-24T14:40:54,052][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:54,052][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:40:56,268][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:40:58,948][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:40:58,988][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=892581988} forced-compaction result (captures: `3` span: `PT10.008473907S`)
[2025-03-24T14:40:58,988][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=180847658} forced-compaction result (captures: `3` span: `PT10.008809066S`)
[2025-03-24T14:40:58,988][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=104314797} forced-compaction result (captures: `3` span: `PT10.008963733S`)
[2025-03-24T14:40:58,988][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1842664733} forced-compaction result (captures: `3` span: `PT10.009158074S`)
[2025-03-24T14:40:58,988][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1158459483} forced-compaction result (captures: `3` span: `PT10.009279354S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1943833092} forced-compaction result (captures: `3` span: `PT10.009386751S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1223545101} forced-compaction result (captures: `3` span: `PT10.009487446S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=56265008} forced-compaction result (captures: `3` span: `PT10.009578806S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=603221864} forced-compaction result (captures: `3` span: `PT10.00966904S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=888098232} forced-compaction result (captures: `3` span: `PT10.009802239S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=304908177} forced-compaction result (captures: `3` span: `PT10.009895308S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1172873197} forced-compaction result (captures: `3` span: `PT10.010133155S`)
[2025-03-24T14:40:58,989][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1172776727} forced-compaction result (captures: `3` span: `PT10.010236283S`)
[2025-03-24T14:40:58,990][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=195764382} forced-compaction result (captures: `3` span: `PT10.010317099S`)
[2025-03-24T14:40:58,990][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2110598773} forced-compaction result (captures: `3` span: `PT10.010443536S`)
[2025-03-24T14:40:58,990][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=2086392278} forced-compaction result (captures: `3` span: `PT10.010639611S`)
[2025-03-24T14:40:58,990][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=16293589} forced-compaction result (captures: `3` span: `PT10.010776929S`)
[2025-03-24T14:40:58,990][DEBUG][org.logstash.instrument.metrics.ExtendedFlowMetric] RetentionWindow{policy=current id=1384505732} forced-compaction result (captures: `3` span: `PT10.010879315S`)
[2025-03-24T14:40:59,064][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:40:59,065][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:41:01,267][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:41:03,951][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:41:04,070][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:41:04,071][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-24T14:41:06,268][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-24T14:41:07,625][DEBUG][filewatch.sincedbcollection][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] writing sincedb (delta since last write = 15)
[2025-03-24T14:41:08,956][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-24T14:41:09,081][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-24T14:41:09,081][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}

Hello and welcome,

Debug mode is rarely needed and can add more confusion than clarity.

You can disable debug logging and re-enable it only if someone asks for it.
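
For example, assuming Logstash is started from the command line (the config path here is just a placeholder), the --log.level flag controls verbosity:

bin/logstash -f config/csvTest.conf --log.level=info   # default level
bin/logstash -f config/csvTest.conf --log.level=debug  # only while actively debugging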

Ignoring the debug lines, there are no issues in the log you shared:

[2025-03-24T14:39:45,382][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-24T14:39:45,425][INFO ][logstash.filters.csv     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-24T14:39:45,438][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/Users/steventunink/Desktop/logstash-8.12.2/config/csvTest.conf"], :thread=>"#<Thread:0x1d8e36e /Users/steventunink/Desktop/logstash-8.12.2/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2025-03-24T14:39:46,250][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.81}
[2025-03-24T14:39:51,260][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-24T14:39:51,264][INFO ][filewatch.observingtail  ][main][deb904bd9fa09d2295923bd5391954b7a06c183a89239c0b6ebdba41737260d3] START, creating Discoverer, Watch with file and sincedb collections
[2025-03-24T14:39:51,274][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

Can you provide more context on what you are trying to do and what is not working?

I am attempting to process .csv files that are dumped into a folder, convert those records to JSON, and push them to an HTTP endpoint that consumes the data. So far I've successfully consumed the .csv and written it out to both the terminal and a JSON file. I've also successfully sent other inputs to the HTTP output, such as JSON data dumped to an Azure EventHub and an NSS syslog feed. I only run into this problem when combining the .csv input with the HTTP output. My initial assumption was that some sort of timeout was occurring, but none of the logs show any errors to that effect.

Those would be logged as ERROR or WARN.

Also, in the debug log you shared, no events were processed.

Can you add a new CSV to the folder so Logstash can process it, and then share the result?

You have a stdout output with the rubydebug codec in your config. So, what are you getting on stdout?

If nothing, then (probably) nothing is going to your HTTP endpoint either, and the issue is earlier, in the input or filter stages.

If something, then maybe share a bit of that output (obfuscated as needed)?
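
One way to isolate the stages, as a minimal sketch (the output file path is a placeholder, not part of your setup): keep the rubydebug stdout, temporarily drop the http output, and also write what the filters produce to a file so you can inspect exactly what would have been sent:

output {
  stdout { codec => rubydebug }
  # Temporary debugging output: captures the filtered events as JSON lines.
  # Remove it and restore the http output once the filter stage looks right.
  file {
    path  => "/tmp/csv_pipeline_debug.json"
    codec => "json_lines"
  }
}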

The

if ![userName] {
 	drop {}
}

might also be a point of interest.
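
While debugging, a less destructive variant might help (the tag name is arbitrary): tag the events instead of dropping them, so they still reach the rubydebug output and you can see whether userName is actually missing:

if ![userName] {
  # Keep the event and mark it instead of dropping it, so it stays visible
  # in the stdout/rubydebug output while troubleshooting.
  mutate { add_tag => ["missing_userName"] }
}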

Also, change your mutate into this:

mutate {
  rename => { "Alert name" => "title" }
  rename => { "Tags" => "tags" }
  rename => { "Severity" => "severity" }
  rename => { "Investigation state" => "investigationState" }
  rename => { "Status" => "status" }
  rename => { "Category" => "category" }
  rename => { "Detection source" => "detectionSource" }
  rename => { "Impacted assets" => "impactedAssets" }
  rename => { "First activity" => "firstActivity" }
  rename => { "Last activity" => "lastActivity" }
  rename => { "Classification" => "classification" }
  rename => { "Determination" => "determination" }
  rename => { "Assignee" => "assignedTo" }
  remove_field => ["message", "[event][original]"]
}
mutate {
  split => ["impactedAssets", ","]
}
mutate {
  copy => { "[impactedAssets][0]" => "deviceName" }
  copy => { "[impactedAssets][1]" => "userName" }
}

You have multiple mutate operations on the same field, impactedAssets, that depend on the order they run in, so they need to be in separate mutate blocks.

Each mutation must be in its own code block if the sequence of operations needs to be preserved.
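
As I read the mutate filter docs (worth verifying against your version), the different operation types inside a single mutate block are not guaranteed to run in the order they are written, which is why the copy has to live in a later block than the split. A minimal sketch of the pattern, reusing the fields from your config:

filter {
  # Each mutate block runs strictly after the previous one,
  # so the copy is guaranteed to see the result of the split.
  mutate { split => ["impactedAssets", ","] }
  mutate { copy => { "[impactedAssets][0]" => "deviceName" } }
}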

Thanks @leandrojmp, it was the multiple mutates! I'll keep an eye on the instance to make sure this isn't a fluke.