Logstash configuration: grok filters for different log formats in one log file

# It is showing duplicate values in Kibana; please help me out.
input {
  file {
    path => "/var/log/aupm.log*"
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message} %{IP:client}" }
    tag_on_failure => []
  }

  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message} %{IP:client} %{WORD:value}\[%{NUMBER:int}\]" }
    tag_on_failure => []
  }

  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message}\[%{IP:client}\] %{GREEDYDATA:message1}" }
    tag_on_failure => []
  }

  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{DATA:message}\[%{IP:client}\] %{WORD:type}=\[%{UUID:id}\]" }
    tag_on_failure => []
  }

  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message}\[%{IP:client}\]" }
    tag_on_failure => []
  }
}

You aren't really explaining what your problem is very well.

I'm getting duplicated values in my Kibana dashboard when I pull the data from the log file.

That's most likely because multiple grok filters match the same input line, and if you capture into the same field more than once you'll get an array.

The grok filter documentation contains examples of how to list multiple grok expressions in the same filter. Grok will then stop at the first expression that matches.
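For example, the five separate filters above could be collapsed into a single grok filter with an array of patterns (a sketch based on the patterns in the question; since grok stops at the first match by default, list the most specific expressions first):

    filter {
      grok {
        # Patterns are tried in order; matching stops at the first hit
        # (break_on_match defaults to true), so each event is captured
        # into each field at most once.
        match => { "message" => [
          "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{DATA:message}\[%{IP:client}\] %{WORD:type}=\[%{UUID:id}\]",
          "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message}\[%{IP:client}\] %{GREEDYDATA:message1}",
          "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message} %{IP:client} %{WORD:value}\[%{NUMBER:int}\]",
          "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message}\[%{IP:client}\]",
          "%{DATESTAMP:timestamp} %{WORD:info} %{SPACE} %{NOTSPACE:http} %{GREEDYDATA:message} %{IP:client}"
        ] }
      }
    }

Note that dropping tag_on_failure => [] here restores the default _grokparsefailure tag, which makes it easier to spot log lines that none of the patterns matched.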

As he said, the documentation covers this: Grok filter plugin | Logstash Reference [8.11] | Elastic

Thank you very much @magnusbaeck @warkolm