I want to parse my custom Symfony application logs. Since there is no dedicated Symfony integration, I am using the Custom Logs integration. The logs do get stored in Elasticsearch, but they are not parsed: the whole log entry ends up inside the `message` field. I tried adding a processor to the custom integration.
This is the grok processor I tried:
```json
{
  "patterns": [
    "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\[\\] \\[ HOST: %{DATA:host} REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{DATA:loglevel}: %{GREEDYDATA:error_message}"
  ]
}
```

(Note: the backslashes have to be doubled here, because `\[` on its own is not a valid escape inside a JSON string.)
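For reference, the complete pipeline I created looks roughly like this (a sketch; as far as I understand, the grok processor also needs a `field` telling it what to parse, here `message`):

```console
PUT _ingest/pipeline/my-custom-pipeline
{
  "description": "Parse Symfony application logs",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\[\\] \\[ HOST: %{DATA:host} REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{DATA:loglevel}: %{GREEDYDATA:error_message}"
        ]
      }
    }
  ]
}
```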
It did not give any error, but the default dataset (`logs-generic-default`) still stores the whole log entry in the `message` field. When I try to create an index containing only those specific fields and set that as the dataset in the Custom Logs integration, the index name itself is not available in the data view. Please help me solve this issue.
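To check whether the grok pattern actually matches, I ran a sample line through the simulate API (the log line below is my guess at the Symfony format the pattern implies):

```console
POST _ingest/pipeline/my-custom-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "[2024-05-01T12:34:56+00:00][] [ HOST: example.com REQUEST URI: /app/login] ERROR: Something went wrong"
      }
    }
  ]
}
```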
I have also created an ingest pipeline and referenced it in the Custom Logs integration configuration:
```yaml
pipeline: "my-custom-pipeline"
multiline.pattern: '^\['
multiline.negate: true
multiline.match: after
```
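I also read that instead of the `pipeline:` setting, one can create a pipeline following the `@custom` naming convention, which the integration's default pipeline is supposed to call automatically. If I understand that correctly, for the dataset `generic` it would be named as below (the exact name is my assumption and worth double-checking):

```console
PUT _ingest/pipeline/logs-generic@custom
{
  "description": "Custom parsing for the generic logs dataset",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "\\[%{TIMESTAMP_ISO8601:timestamp}\\]\\[\\] \\[ HOST: %{DATA:host} REQUEST URI: %{URIPATHPARAM:request_uri}\\] %{DATA:loglevel}: %{GREEDYDATA:error_message}"
        ]
      }
    }
  ]
}
```

Is that the right approach for the Custom Logs integration?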
But it still stores the whole log entry in the `message` field. I don't know how to use the Custom Logs integration correctly to parse my logs. Please explain.