How to feed entire logfile to elasticsearch as a message

I'm trying to use Logstash to feed a whole text file into Elasticsearch as a single message. I am currently using the multiline filter, as below.

multiline {
  pattern => "/.*./gm"
  negate  => true
  what    => "previous"
}

This works for me, but my concern is that the multiline filter documentation says: "This filter has been deprecated in favor of multiline-codec. Multiline filter is not thread-safe and cannot handle messages from multiple streams"

I tried using the multiline codec with no luck.
Could you please suggest an alternative to the multiline filter for this scenario?

What do you mean?
Do you have a config and sample log you can show?

Sorry for the delayed reply.
input {
  file {
    codec => multiline {
      pattern => "/.*./gm"
      negate  => true
      what    => "previous"
    }
    path => ["path_for_logs"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "LOG"
  }
}

filter {
  if [type] == "LOG" {

    ruby {
      # Note: this is the Logstash 2.x event API; on Logstash 5.x+ use
      # event.set('logfile_path', event.get('path').strip) instead.
      code => "event['logfile_path'] = event['path'].strip"
    }

    mutate {
      # Both substitutions belong in a single gsub array:
      # strip LF and CR so the file content becomes one line.
      gsub         => ["message", "\n", "", "message", "\r", ""]
      add_field    => ["logdata", "%{message}"]
      remove_field => ["@version", "path", "host", "tags", "message"]
    }

  }
}
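The two gsub substitutions in the mutate block above collapse the multiline message onto a single line. In plain Ruby, the equivalent transformation is (a sketch; the sample string is made up):

```ruby
# Strip LF and CR so the whole file content sits on one line,
# mirroring the mutate/gsub step in the filter above.
message = "line one\r\nline two\nline three"
flattened = message.gsub("\n", "").gsub("\r", "")
puts flattened
# => line oneline twoline three
```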

output {
  if [type] == "LOG" {
    stdout {
      codec => rubydebug
    }
    elasticsearch {
      #hosts => "127.0.0.1:9200"
      #template_overwrite => true
      template_name => "template_abc"
      manage_template => true
      template => "/opt/elasticsearch/config/templates/template_abc.json"
      hosts => "127.0.0.1:9200"
      index => "abc"
      document_type => "logsearch"
      document_id => "%{logfile_path}"
    }
  }
}

Sample log:
This is any unstructured log (literally any file) that I want to index into an Elasticsearch document as a field.

Request help on this ASAP.
Is it possible to use the multiline codec to push an entire logfile (any file, with no specific pattern) as a single field to Elasticsearch?
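For what it's worth, the original config likely works because Logstash patterns are not /…/-delimited: the slashes and the trailing "gm" in "/.*./gm" are literal characters, so the pattern never matches a real line, and with negate => true every line is then folded into the previous one. A minimal sketch of the same idea with the multiline codec, using an explicit never-matching placeholder pattern (the marker string, path, and interval here are assumptions, not tested values):

```
input {
  file {
    path => ["path_for_logs"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      # Placeholder pattern that should never appear in a real line;
      # with negate => true every line joins the previous one, so the
      # whole file becomes a single event.
      pattern => "__NEVER_MATCHES__"
      negate  => true
      what    => "previous"
      # Flush the pending (single) event after 2s of inactivity,
      # since no later line will ever trigger emission of the last one.
      auto_flush_interval => 2
      # Raise the default limit (500 lines) for large files.
      max_lines => 10000
    }
  }
}
```

The auto_flush_interval setting matters here: without it, the final event of a file is only emitted when something else arrives on the same stream.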