stdout output does not print to shell at info log level

The following pipeline reads from MySQL through the jdbc input plugin and outputs every item through stdout. However, when I run logstash -f mysql2es.conf, it does not print anything. When I add --debug, I can see the entities that jdbc reads in.

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/aiplatform_face"
    jdbc_user => "root"
    jdbc_password => "mypass"
    statement => "SELECT id, create_time, vec FROM vec_table WHERE id > :sql_last_value LIMIT :size OFFSET :offset"
    jdbc_paging_enabled => true
    jdbc_paging_mode => "explicit"
    jdbc_page_size => 100000
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "id"
    last_run_metadata_path => "/root/tools/logstash/confs/test-jdbc-int-sql_last_value.yml"
  }
}

output {
  stdout {}
}
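As an aside, the stdout output can be made more explicit with the rubydebug codec, which pretty-prints each event as a Ruby hash (as far as I know rubydebug is already the default codec for stdout, so this mainly documents the intent). A minimal sketch, with the rest of the pipeline unchanged:

output {
  stdout {
    # rubydebug prints each event as a formatted hash, making it obvious
    # whether any events reach the output stage at all
    codec => rubydebug
  }
}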

From the output below, you can see that nothing is printed to stdout.

[INFO ] 2023-02-24 10:00:15.959 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[WARN ] 2023-02-24 10:00:16.109 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2023-02-24 10:00:16.493 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2023-02-24 10:00:16.768 [Converge PipelineAction::Create<main>] Reflections - Reflections took 88 ms to scan 1 urls, producing 127 keys and 444 values
[INFO ] 2023-02-24 10:00:17.067 [Converge PipelineAction::Create<main>] javapipeline - Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[INFO ] 2023-02-24 10:00:17.082 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/root/tools/logstash/pipeline/my2es.conf"], :thread=>"#<Thread:0x4fbe5496@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[INFO ] 2023-02-24 10:00:17.468 [[main]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>0.39}
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
[INFO ] 2023-02-24 10:00:17.878 [[main]-pipeline-manager] jdbc - ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[INFO ] 2023-02-24 10:00:17.879 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2023-02-24 10:00:17.889 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2023-02-24 10:00:17.896 [[main]<jdbc] jdbc - (0.010360s) SELECT id, create_time, vec FROM vec_table WHERE id > 349543 LIMIT 100000 OFFSET 0
[INFO ] 2023-02-24 10:00:17.994 [[main]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>"main"}
[INFO ] 2023-02-24 10:00:18.393 [Converge PipelineAction::Delete<main>] pipelinesregistry - Removed pipeline from registry successfully {:pipeline_id=>:main}
[INFO ] 2023-02-24 10:00:18.396 [LogStash::Runner] runner - Logstash shut down.

Now I understand Logstash and the jdbc plugin better. I think what I observed earlier, as stated in the original post, was because the MySQL table records had already been scanned and there was no more data beyond the last tracked value. Since Logstash did not read in any new records, it had nothing to print.
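In other words, the jdbc input persists the highest seen id into last_run_metadata_path and substitutes it for :sql_last_value on the next run (here id > 349543, as shown in the query log above), so once the table has been fully scanned no further rows match. If I understand the plugin correctly, a full re-read can be forced with the clean_run option (or by deleting the metadata file). A sketch, reusing the tracking settings from the config above:

input {
  jdbc {
    # ... same connection and statement settings as above ...
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    last_run_metadata_path => "/root/tools/logstash/confs/test-jdbc-int-sql_last_value.yml"
    # clean_run => true ignores the previously stored sql_last_value,
    # so the statement starts again from the column's initial value
    clean_run => true
  }
}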

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.