Logs pushed by Logstash do not appear in Google Cloud Storage

Hello, I recently created a Google Cloud account and installed the google_cloud_storage output plugin. Here is my configuration:
input {
  udp {
    port => 55514
    type => "syslog"
  }
}

output {
  google_cloud_storage {
    bucket => "test-logstash-gcp-send-function"
    json_key_file => "./gcp_account.json"
    log_file_prefix => "logstash_gcs"
    temp_directory => "/tmp/logstash-gcs"
    date_pattern => "%Y-%m-%dT%H:00"
    uploader_interval_secs => 10
    #codec => line {format=> "[%{year}-%{month}-%{day}]Policy:%{policy} | Request Bytes: %{http_request_bytes} | Response Bytes: %{http_response_bytes}"}
  }
}
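To generate test events for the UDP input above, I send a syslog-style line to the listener with a small script like this (a minimal sketch; the host, port, and message text are just my test values, with the port matching the config):

```python
import socket

# Send one sample syslog-style line to the Logstash UDP input.
# Port 55514 matches the udp input in the config above; the
# message content itself is arbitrary test data.
HOST, PORT = "127.0.0.1", 55514
message = b"<13>Aug 31 17:29:00 testhost myapp: hello from udp test"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sent = sock.sendto(message, (HOST, PORT))
sock.close()
print(f"sent {sent} bytes")
```

Since UDP is connectionless, the send succeeds whether or not Logstash is listening, so the rotated file in the log below confirms the events were at least received and buffered.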
And here is the log output:

[ec2-user@ip-xxx ~]$ /usr/share/logstash/bin/logstash -f logstash_syslog_gcp.config 
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2021-08-31 17:28:42.188 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2021-08-31 17:28:42.204 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.7.0"}
[INFO ] 2021-08-31 17:28:50.387 [Converge PipelineAction::Create<main>] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2021-08-31 17:28:50.444 [[main]-pipeline-manager] googlecloudstorage - Using temporary directory: /tmp/logstash-gcs
[INFO ] 2021-08-31 17:28:50.595 [[main]-pipeline-manager] googlecloudstorage - Initializing Google API client, key: ./gcp_account.json
[INFO ] 2021-08-31 17:28:50.960 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1306adc9 run>"}
[INFO ] 2021-08-31 17:28:51.141 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2021-08-31 17:28:51.178 [[main]<udp] udp - Starting UDP listener {:address=>"0.0.0.0:55514"}
[INFO ] 2021-08-31 17:28:51.268 [[main]<udp] udp - UDP listener started {:address=>"0.0.0.0:55514", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[INFO ] 2021-08-31 17:28:51.488 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
^C[WARN ] 2021-08-31 17:30:52.611 [SIGINT handler] runner - SIGINT received. Shutting down.
[INFO ] 2021-08-31 17:30:53.677 [[main]-pipeline-manager] googlecloudstorage - Rotated out file: /tmp/logstash-gcs/xxxxxxxx_2021-08-31T17:00.part001.log
[INFO ] 2021-08-31 17:30:53.749 [pool-5-thread-1] googlecloudstorage - Uploading file to test-logstash-gcp-send-function/xxxxxxxx_2021-08-31T17:00.part001.log
[INFO ] 2021-08-31 17:30:54.787 [[main]-pipeline-manager] pipeline - Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x1306adc9 run>"}

According to the log, xxxxxxxx_2021-08-31T17:00.part001.log should have been uploaded to the Google Cloud Storage bucket test-logstash-gcp-send-function. However, that bucket is still empty, so no log file was actually pushed. There is no error or warning message indicating that the upload failed, and the bucket's logs show no incoming files either. I have also tested uploading files to this bucket manually, and that works fine. Could anyone tell me what the reason might be?