How can I parse my logs using the Custom Logs integration?

I want to parse my custom Symfony application logs. Since Symfony doesn't have a dedicated integration, I'm trying to use the Custom Logs integration. The logs are being stored in Elasticsearch, but they are not parsed: the whole log entry ends up inside the message field. I tried adding a processor to the custom integration.
Then I tried this processor:

{
  "patterns": [
    "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\[\\] \\[ HOST: %{DATA:host} REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{DATA:loglevel}: %{GREEDYDATA:error_message}"
  ]
}
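
For reference, a pattern like this can be checked with the Simulate Pipeline API before wiring it into the integration (the sample log line below is only an illustration of the format I'm assuming):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\[\\] \\[ HOST: %{DATA:host} REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{DATA:loglevel}: %{GREEDYDATA:error_message}"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "[2024-05-01 10:23:45][] [ HOST: app01.example.com REQUEST URI: /api/users] app.ERROR: Something went wrong"
      }
    }
  ]
}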

It didn't give any error, but the default data stream logs-generic-default still stores the whole log entry in the message field. When I try to create an index with only those specific fields and set it as the dataset in the Custom Logs integration, that index name itself is not available when creating a data view.
Please help me solve this issue.
I have created a pipeline and tried to reference it in the custom configuration of the Custom Logs integration:

pipeline: "my-custom-pipeline"
multiline.pattern: '^\['
multiline.negate: true
multiline.match: after
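
One sanity check here (assuming the pipeline name in the config has to match an existing ingest pipeline exactly) is to confirm the pipeline actually exists in the cluster:

GET _ingest/pipeline/my-custom-pipeline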

But it still stores the whole log entry in the message field. I don't know how to use the Custom Logs integration correctly to parse my logs.

Please explain how to do this.

I have created an ingest pipeline:

PUT _ingest/pipeline/custom_log_pipeline3
{
  "description": "Pipeline for processing custom logs",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\s*\\[%{DATA:loglevel}\\]\\s*\\[ HOST: %{HOSTNAME:host}\\s*REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{GREEDYDATA:message}"
        ]
      }
    }
  ]
}
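
To rule out the grok pattern itself, the pipeline can be simulated by name against a sample line (the line below is only an illustration of the format I'm assuming):

POST _ingest/pipeline/custom_log_pipeline3/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "[2024-05-01 10:23:45] [ERROR] [ HOST: app01.example.com REQUEST URI: /api/users] Something went wrong"
      }
    }
  ]
}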

and referenced it in the custom configuration of the Custom Logs integration. I have also created an index with explicit mappings:

PUT /myappo1
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      },
      "loglevel": {
        "type": "keyword"
      },
      "host": {
        "type": "keyword"
      },
      "request_uri": {
        "type": "text"
      },
      "message": {
        "type": "text"
      }
    }
  }
}
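
The manual test mentioned below was roughly like this, indexing a sample line into that index through the pipeline (the log line is illustrative):

POST /myappo1/_doc?pipeline=custom_log_pipeline3
{
  "message": "[2024-05-01 10:23:45] [ERROR] [ HOST: app01.example.com REQUEST URI: /api/users] Something went wrong"
}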

I have tested this by manually posting a log line into that index with the pipeline (roughly as sketched above), and it works fine, but it does not work through the Custom Logs integration. In the custom configuration of the Custom Logs integration I have specified:

pipeline: custom_log_pipeline3

I set myappo1 as the dataset in the Custom Logs integration. After attaching it to my agent, it creates a new data stream like logs-myappo1-*, but it contains none of the fields I want, such as request_uri, loglevel, timestamp, or host.name!
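
For reference, the fields that actually land in the new data stream can be inspected with a query like this (assuming the data stream is logs-myappo1-*):

GET logs-myappo1-*/_search
{
  "size": 1,
  "sort": [
    { "@timestamp": "desc" }
  ]
}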