Logstash isn't sending data to Elasticsearch

Hi, I have one VM for Kibana + Elasticsearch and one VM for Logstash.
Logstash isn't sending data to Elasticsearch; this is the log:

[DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

How can I fix this?
Logstash 8.6.2, no Docker.


This log is unrelated to your issue. If Logstash isn't sending data to Elasticsearch, there will be more logs referencing the elasticsearch output; please share those logs.

Also, what does your Logstash configuration look like?

I've found this error:
"_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"data_stream [logs-**************] must not contain the following characters ['\\','/','*','?','\"','<','>','|',' ',',']"}}}

This is saying the data stream name contains illegal characters. What does your output configuration look like?

I'm trying to filter Fortinet logs

You need to share your Logstash configuration; the error says that you are trying to use unsupported characters in the name of the data_stream.

It is not possible to know how you are naming the data_stream unless you share your Logstash configuration.
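For reference, a minimal elasticsearch output using data streams might look like the sketch below. The host and the dataset/namespace values are placeholders, not your actual settings; the point is that each name part must avoid the characters listed in the error:

```
output {
  elasticsearch {
    hosts => ["https://my-elastic-host:9200"]   # placeholder host

    # The final data stream name is built as <type>-<dataset>-<namespace>,
    # e.g. logs-fortinet-default. None of the parts may contain
    # \ / * ? " < > | space or comma, and they should be lowercase.
    data_stream           => "true"
    data_stream_type      => "logs"
    data_stream_dataset   => "fortinet"
    data_stream_namespace => "default"
  }
}
```

If you build the dataset or namespace from an event field (e.g. `%{[some][field]}`), a field value containing spaces or other illegal characters would produce exactly the 400 error you are seeing.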

Could you suggest how to fix the cgroup problem first?
After that, I'll troubleshoot the ELK issue.


This depends entirely on your infrastructure; you didn't say what your VM OS is or how you are using it.

But as I said, this is unrelated to your issue; if you want to fix it, you should open another topic or edit this one.

Also, if you search the forum for this issue, you will find a couple of posts about it.

I have a Debian 11 VM.
I hope to fix the first problem because, when I change the config to fix the second problem, Logstash doesn't stop cleanly and I have to reboot the VM. I hope fixing the first problem will resolve this!