I got this error when using Logstash

Hello.
I installed Logstash and set up the configuration.

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  file {
    type => "csv"
    path => "Z:/5/upwork/ctf/ThreatHound-main/sysmon.csv"
    #path => "Z:\5\upwork\ctf\ThreatHound-main\sysmon.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    #columns => ["Level", "Date and Time", "Source", "Event ID", "Task Category"]
  }
}
output {
  tcp {
    host => "localhost"
    port => 12345
    codec => json_lines
  }
}

and I was running the Python TCP server:

Microsoft Windows [Version 10.0.19045.3693]
(c) Microsoft Corporation. All rights reserved.

Z:\5\upwork\ctf\ThreatHound-main>python server.py
Listening on port 12345...
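
(server.py itself isn't shown in this thread; it is a simple JSON-lines TCP listener, roughly like the sketch below — the real script may differ.)

# server.py (sketch) -- receives newline-delimited JSON from the
# Logstash tcp output (codec => json_lines) and prints each event.
import json
import socket

PORT = 12345

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", PORT))
    srv.listen(1)
    print(f"Listening on port {PORT}...")
    conn, _addr = srv.accept()
    with conn, conn.makefile("r", encoding="utf-8") as lines:
        for line in lines:  # one JSON document per line
            print(json.loads(line))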

When I ran Logstash, I got no data.

E:\logstash-8.11.1>bin\logstash -f config\logstash.conf --debug
"Using bundled JDK: E:\logstash-8.11.1\jdk\bin\java.exe"
Sending Logstash logs to E:/logstash-8.11.1/logs which is now configured via log4j2.properties
[2023-12-05T10:55:59,886][INFO ][logstash.runner          ] Log4j configuration path used is: E:\logstash-8.11.1\config\log4j2.properties
[2023-12-05T10:55:59,887][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.11.1", "jruby.version"=>"jruby 9.4.2.0 (3.1.0) 2023-03-08 90d2913fda OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-mswin32]"}
[2023-12-05T10:55:59,888][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-12-05T10:55:59,893][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"E:/logstash-8.11.1/modules/fb_apache/configuration"}
[2023-12-05T10:55:59,893][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x45e4dc42 @directory="E:/logstash-8.11.1/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2023-12-05T10:55:59,894][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"E:/logstash-8.11.1/modules/netflow/configuration"}
[2023-12-05T10:55:59,894][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x39c65cb4 @directory="E:/logstash-8.11.1/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2023-12-05T10:55:59,898][DEBUG][logstash.runner          ] Setting global FieldReference escape style: none
[2023-12-05T10:55:59,911][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2023-12-05T10:55:59,911][DEBUG][logstash.runner          ] allow_superuser: true
[2023-12-05T10:55:59,911][DEBUG][logstash.runner          ] node.name: "DESKTOP-86JDRSU"
[2023-12-05T10:55:59,912][DEBUG][logstash.runner          ] *path.config: "config\\logstash.conf"
[2023-12-05T10:55:59,912][DEBUG][logstash.runner          ] path.data: "E:/logstash-8.11.1/data"
[2023-12-05T10:55:59,912][DEBUG][logstash.runner          ] modules.cli: #<Java::OrgLogstashUtil::ModulesSettingArray: []>
[2023-12-05T10:55:59,913][DEBUG][logstash.runner          ] modules: []
[2023-12-05T10:55:59,913][DEBUG][logstash.runner          ] modules_list: []
[2023-12-05T10:55:59,913][DEBUG][logstash.runner          ] modules_variable_list: []
[2023-12-05T10:55:59,914][DEBUG][logstash.runner          ] modules_setup: false
[2023-12-05T10:55:59,915][DEBUG][logstash.runner          ] config.test_and_exit: false
[2023-12-05T10:55:59,916][DEBUG][logstash.runner          ] config.reload.automatic: false
[2023-12-05T10:55:59,916][DEBUG][logstash.runner          ] config.reload.interval: #<Java::OrgLogstashUtil::TimeValue:0x1313d0b3>
[2023-12-05T10:55:59,916][DEBUG][logstash.runner          ] config.support_escapes: false
[2023-12-05T10:55:59,916][DEBUG][logstash.runner          ] config.field_reference.escape_style: "none"
[2023-12-05T10:55:59,917][DEBUG][logstash.runner          ] event_api.tags.illegal: "rename"
[2023-12-05T10:55:59,917][DEBUG][logstash.runner          ] metric.collect: true
[2023-12-05T10:55:59,917][DEBUG][logstash.runner          ] pipeline.id: "main"
[2023-12-05T10:55:59,917][DEBUG][logstash.runner          ] pipeline.system: false
[2023-12-05T10:55:59,917][DEBUG][logstash.runner          ] pipeline.workers: 16
[2023-12-05T10:55:59,918][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2023-12-05T10:55:59,925][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2023-12-05T10:55:59,926][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2023-12-05T10:55:59,926][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2023-12-05T10:55:59,926][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2023-12-05T10:55:59,926][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2023-12-05T10:55:59,927][DEBUG][logstash.runner          ] pipeline.ordered: "auto"
[2023-12-05T10:55:59,927][DEBUG][logstash.runner          ] pipeline.ecs_compatibility: "v8"
[2023-12-05T10:55:59,927][DEBUG][logstash.runner          ] path.plugins: []
[2023-12-05T10:55:59,929][DEBUG][logstash.runner          ] config.debug: false
[2023-12-05T10:55:59,929][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2023-12-05T10:55:59,929][DEBUG][logstash.runner          ] version: false
[2023-12-05T10:55:59,930][DEBUG][logstash.runner          ] help: false
[2023-12-05T10:55:59,930][DEBUG][logstash.runner          ] enable-local-plugin-development: false
[2023-12-05T10:55:59,930][DEBUG][logstash.runner          ] log.format: "plain"
[2023-12-05T10:55:59,930][DEBUG][logstash.runner          ] api.enabled: true
[2023-12-05T10:55:59,931][DEBUG][logstash.runner          ] api.http.host: "127.0.0.1"
[2023-12-05T10:55:59,931][DEBUG][logstash.runner          ] api.http.port: 9600..9700
[2023-12-05T10:55:59,931][DEBUG][logstash.runner          ] api.environment: "production"
[2023-12-05T10:55:59,932][DEBUG][logstash.runner          ] api.auth.type: "none"
[2023-12-05T10:55:59,932][DEBUG][logstash.runner          ] api.auth.basic.password_policy.mode: "WARN"
[2023-12-05T10:55:59,932][DEBUG][logstash.runner          ] api.auth.basic.password_policy.length.minimum: 8
[2023-12-05T10:55:59,933][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.upper: "REQUIRED"
[2023-12-05T10:55:59,938][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.lower: "REQUIRED"
[2023-12-05T10:55:59,939][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.digit: "REQUIRED"
[2023-12-05T10:55:59,939][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.symbol: "OPTIONAL"
[2023-12-05T10:55:59,940][DEBUG][logstash.runner          ] api.ssl.enabled: false
[2023-12-05T10:55:59,940][DEBUG][logstash.runner          ] api.ssl.supported_protocols: []
[2023-12-05T10:55:59,940][DEBUG][logstash.runner          ] queue.type: "memory"
[2023-12-05T10:55:59,941][DEBUG][logstash.runner          ] queue.drain: false
[2023-12-05T10:55:59,941][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2023-12-05T10:55:59,947][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2023-12-05T10:55:59,947][DEBUG][logstash.runner          ] queue.max_events: 0
[2023-12-05T10:55:59,947][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2023-12-05T10:55:59,948][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2023-12-05T10:55:59,948][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2023-12-05T10:55:59,948][DEBUG][logstash.runner          ] queue.checkpoint.retry: true
[2023-12-05T10:55:59,951][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2023-12-05T10:55:59,959][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2023-12-05T10:55:59,959][DEBUG][logstash.runner          ] dead_letter_queue.flush_interval: 5000
[2023-12-05T10:55:59,959][DEBUG][logstash.runner          ] dead_letter_queue.storage_policy: "drop_newer"
[2023-12-05T10:55:59,960][DEBUG][logstash.runner          ] slowlog.threshold.warn: #<Java::OrgLogstashUtil::TimeValue:0x20eee47c>
[2023-12-05T10:55:59,960][DEBUG][logstash.runner          ] slowlog.threshold.info: #<Java::OrgLogstashUtil::TimeValue:0x6d88a3f4>
[2023-12-05T10:55:59,960][DEBUG][logstash.runner          ] slowlog.threshold.debug: #<Java::OrgLogstashUtil::TimeValue:0x3e9a0143>
[2023-12-05T10:55:59,961][DEBUG][logstash.runner          ] slowlog.threshold.trace: #<Java::OrgLogstashUtil::TimeValue:0x4cf1ba4e>
[2023-12-05T10:55:59,961][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2023-12-05T10:55:59,962][DEBUG][logstash.runner          ] keystore.file: "E:/logstash-8.11.1/config/logstash.keystore"
[2023-12-05T10:55:59,962][DEBUG][logstash.runner          ] path.queue: "E:/logstash-8.11.1/data/queue"
[2023-12-05T10:55:59,962][DEBUG][logstash.runner          ] path.dead_letter_queue: "E:/logstash-8.11.1/data/dead_letter_queue"
[2023-12-05T10:55:59,962][DEBUG][logstash.runner          ] path.settings: "E:/logstash-8.11.1/config"
[2023-12-05T10:55:59,963][DEBUG][logstash.runner          ] path.logs: "E:/logstash-8.11.1/logs"
[2023-12-05T10:55:59,964][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2023-12-05T10:55:59,965][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2023-12-05T10:55:59,965][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x5882ff51>
[2023-12-05T10:55:59,972][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0xef3bc19>
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "full"
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.cipher_suites: []
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2023-12-05T10:55:59,973][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2023-12-05T10:55:59,974][DEBUG][logstash.runner          ] monitoring.enabled: false
[2023-12-05T10:55:59,974][DEBUG][logstash.runner          ] monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2023-12-05T10:55:59,974][DEBUG][logstash.runner          ] monitoring.collection.interval: #<Java::OrgLogstashUtil::TimeValue:0x554d4856>
[2023-12-05T10:55:59,974][DEBUG][logstash.runner          ] monitoring.collection.timeout_interval: #<Java::OrgLogstashUtil::TimeValue:0x2a673ee8>
[2023-12-05T10:55:59,975][DEBUG][logstash.runner          ] monitoring.elasticsearch.username: "logstash_system"
[2023-12-05T10:55:59,975][DEBUG][logstash.runner          ] monitoring.elasticsearch.ssl.verification_mode: "full"
[2023-12-05T10:55:59,975][DEBUG][logstash.runner          ] monitoring.elasticsearch.ssl.cipher_suites: []
[2023-12-05T10:55:59,975][DEBUG][logstash.runner          ] monitoring.elasticsearch.sniffing: false
[2023-12-05T10:55:59,975][DEBUG][logstash.runner          ] monitoring.collection.pipeline.details.enabled: true
[2023-12-05T10:55:59,976][DEBUG][logstash.runner          ] monitoring.collection.config.enabled: true
[2023-12-05T10:55:59,976][DEBUG][logstash.runner          ] node.uuid: ""
[2023-12-05T10:55:59,978][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2023-12-05T10:55:59,978][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: #<Java::OrgLogstashUtil::TimeValue:0x320aecd3>
[2023-12-05T10:55:59,978][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2023-12-05T10:55:59,978][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2023-12-05T10:55:59,978][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2023-12-05T10:55:59,979][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.cipher_suites: []
[2023-12-05T10:55:59,979][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "full"
[2023-12-05T10:55:59,979][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2023-12-05T10:55:59,979][DEBUG][logstash.runner          ] xpack.geoip.downloader.enabled: true
[2023-12-05T10:55:59,979][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2023-12-05T10:55:59,981][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-12-05T10:56:00,001][DEBUG][logstash.agent           ] Initializing API WebServer {"api.http.host"=>"127.0.0.1", "api.http.port"=>9600..9700, "api.ssl.enabled"=>false, "api.auth.type"=>"none", "api.environment"=>"production"}
[2023-12-05T10:56:00,008][DEBUG][logstash.api.service     ] [api-service] start
[2023-12-05T10:56:00,046][DEBUG][logstash.agent           ] Setting up metric collection
[2023-12-05T10:56:00,048][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2023-12-05T10:56:00,049][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:00,065][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2023-12-05T10:56:00,231][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:00,233][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:00,239][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2023-12-05T10:56:00,241][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2023-12-05T10:56:00,241][DEBUG][logstash.instrument.periodicpoller.flowrate] Starting {:polling_interval=>5, :polling_timeout=>120}
[2023-12-05T10:56:00,499][DEBUG][logstash.agent           ] Starting agent
[2023-12-05T10:56:00,502][DEBUG][logstash.agent           ] Starting API WebServer (puma)
[2023-12-05T10:56:00,503][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["E:/logstash-8.11.1/config/jvm.options", "E:/logstash-8.11.1/config/log4j2.properties", "E:/logstash-8.11.1/config/logstash-sample.conf", "E:/logstash-8.11.1/config/logstash.yml", "E:/logstash-8.11.1/config/pipelines.yml", "E:/logstash-8.11.1/config/startup.options"]}
[2023-12-05T10:56:00,508][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"E:/logstash-8.11.1/config/logstash.conf"}
[2023-12-05T10:56:00,509][DEBUG][logstash.agent           ] Trying to start API WebServer {:port=>9600, :ssl_enabled=>false}
[2023-12-05T10:56:00,516][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2023-12-05T10:56:00,525][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2023-12-05T10:56:00,532][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2023-12-05T10:56:00,593][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-12-05T10:56:00,759][INFO ][org.reflections.Reflections] Reflections took 59 ms to scan 1 urls, producing 132 keys and 464 values
[2023-12-05T10:56:00,774][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2023-12-05T10:56:00,847][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File}
[2023-12-05T10:56:00,898][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2023-12-05T10:56:00,903][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_5d13a9d5-3e58-4766-8222-f0527aa6b333"
[2023-12-05T10:56:00,904][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2023-12-05T10:56:00,904][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2023-12-05T10:56:00,916][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[2023-12-05T10:56:00,918][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["Z:/5/upwork/ctf/ThreatHound-main/sysmon1.csv"]
[2023-12-05T10:56:00,918][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@id = "7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f"
[2023-12-05T10:56:00,923][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@type = "csv"
[2023-12-05T10:56:00,923][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@enable_metric = true
[2023-12-05T10:56:00,924][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain id=>"plain_5d13a9d5-3e58-4766-8222-f0527aa6b333", enable_metric=>true, charset=>"UTF-8">
[2023-12-05T10:56:00,924][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@add_field = {}
[2023-12-05T10:56:00,924][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@stat_interval = 1.0
[2023-12-05T10:56:00,924][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@discover_interval = 15
[2023-12-05T10:56:00,925][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_write_interval = 15.0
[2023-12-05T10:56:00,925][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@delimiter = "\n"
[2023-12-05T10:56:00,925][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@close_older = 3600.0
[2023-12-05T10:56:00,925][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@mode = "tail"
[2023-12-05T10:56:00,926][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_completed_action = "delete"
[2023-12-05T10:56:00,926][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_clean_after = 1209600.0
[2023-12-05T10:56:00,926][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_size = 32768
[2023-12-05T10:56:00,927][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_count = 140737488355327
[2023-12-05T10:56:00,927][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_by = "last_modified"
[2023-12-05T10:56:00,929][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_direction = "asc"
[2023-12-05T10:56:00,929][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@exit_after_read = false
[2023-12-05T10:56:00,929][DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@check_archive_validity = false
[2023-12-05T10:56:00,951][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"csv", :type=>"filter", :class=>LogStash::Filters::CSV}
[2023-12-05T10:56:00,976][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@separator = ","
[2023-12-05T10:56:00,976][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@id = "e1ef443fd589f441fa6c6acf8f0a9648df6069b979a3451fd54adcef323e112d"
[2023-12-05T10:56:00,976][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@enable_metric = true
[2023-12-05T10:56:00,977][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_tag = []
[2023-12-05T10:56:00,977][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_tag = []
[2023-12-05T10:56:00,977][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_field = {}
[2023-12-05T10:56:00,978][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_field = []
[2023-12-05T10:56:00,978][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@periodic_flush = false
[2023-12-05T10:56:00,978][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@source = "message"
[2023-12-05T10:56:00,978][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@columns = []
[2023-12-05T10:56:00,979][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@quote_char = "\""
[2023-12-05T10:56:00,988][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autogenerate_column_names = true
[2023-12-05T10:56:00,988][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_header = false
[2023-12-05T10:56:00,988][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_empty_columns = false
[2023-12-05T10:56:00,988][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_empty_rows = false
[2023-12-05T10:56:00,988][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@convert = {}
[2023-12-05T10:56:00,989][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autodetect_column_names = false
[2023-12-05T10:56:00,998][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"tcp", :type=>"output", :class=>LogStash::Outputs::Tcp}
[2023-12-05T10:56:01,019][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"json_lines", :type=>"codec", :class=>LogStash::Codecs::JSONLines}
[2023-12-05T10:56:01,026][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@id = "json_lines_c9f32bc4-2d4c-4d66-b302-7f247f541c9a"
[2023-12-05T10:56:01,028][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@enable_metric = true
[2023-12-05T10:56:01,028][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@charset = "UTF-8"
[2023-12-05T10:56:01,035][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@delimiter = "\n"
[2023-12-05T10:56:01,036][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-12-05T10:56:01,043][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@host = "localhost"
[2023-12-05T10:56:01,045][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@codec = <LogStash::Codecs::JSONLines id=>"json_lines_c9f32bc4-2d4c-4d66-b302-7f247f541c9a", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2023-12-05T10:56:01,052][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@id = "aca75b1c386c2665d4cb58cc3ffa2e29cfe7e80449ada242df6b897831b38371"
[2023-12-05T10:56:01,052][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@port = 12346
[2023-12-05T10:56:01,052][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@enable_metric = true
[2023-12-05T10:56:01,053][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@workers = 1
[2023-12-05T10:56:01,053][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@reconnect_interval = 10
[2023-12-05T10:56:01,053][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@mode = "client"
[2023-12-05T10:56:01,056][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@ssl_enable = false
[2023-12-05T10:56:01,056][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@ssl_verify = false
[2023-12-05T10:56:01,056][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@ssl_key_passphrase = <password>
[2023-12-05T10:56:01,056][DEBUG][logstash.outputs.tcp     ] config LogStash::Outputs::Tcp/@ssl_supported_protocols = []
[2023-12-05T10:56:01,070][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-12-05T10:56:01,073][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2023-12-05T10:56:01,076][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2023-12-05T10:56:01,076][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2023-12-05T10:56:01,082][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :main, :flow]`
[2023-12-05T10:56:01,083][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :main, :flow]`
[2023-12-05T10:56:01,083][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :main, :plugins, :inputs, :"7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f", :flow]`
[2023-12-05T10:56:01,085][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :e1ef443fd589f441fa6c6acf8f0a9648df6069b979a3451fd54adcef323e112d, :flow]`
[2023-12-05T10:56:01,086][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :e1ef443fd589f441fa6c6acf8f0a9648df6069b979a3451fd54adcef323e112d, :flow]`
[2023-12-05T10:56:01,086][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :aca75b1c386c2665d4cb58cc3ffa2e29cfe7e80449ada242df6b897831b38371, :flow]`
[2023-12-05T10:56:01,087][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :aca75b1c386c2665d4cb58cc3ffa2e29cfe7e80449ada242df6b897831b38371, :flow]`
[2023-12-05T10:56:01,087][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2023-12-05T10:56:01,092][DEBUG][logstash.filters.csv     ][main] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2023-12-05T10:56:01,094][INFO ][logstash.filters.csv     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-12-05T10:56:01,107][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["E:/logstash-8.11.1/config/logstash.conf"], :thread=>"#<Thread:0x18ecd44e E:/logstash-8.11.1/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-12-05T10:56:01,650][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.54}
[2023-12-05T10:56:01,661][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"E:/logstash-8.11.1/data/plugins/inputs/file/.sincedb_ce726a7475d7c755e8112b4728d8fbd8", :path=>["Z:/5/upwork/ctf/ThreatHound-main/sysmon1.csv"]}
[2023-12-05T10:56:01,662][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-12-05T10:56:01,666][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x18ecd44e E:/logstash-8.11.1/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-12-05T10:56:01,667][INFO ][filewatch.observingtail  ][main][7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f] START, creating Discoverer, Watch with file and sincedb collections
[2023-12-05T10:56:01,671][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-12-05T10:56:01,674][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:01,678][DEBUG][filewatch.sincedbcollection][main][7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f] open: reading from E:/logstash-8.11.1/data/plugins/inputs/file/.sincedb_ce726a7475d7c755e8112b4728d8fbd8
[2023-12-05T10:56:02,682][DEBUG][filewatch.sincedbcollection][main][7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f] writing sincedb (delta since last write = 1701770162)
[2023-12-05T10:56:05,075][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:05,262][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:05,262][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:06,674][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:10,079][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:10,287][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:10,288][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:11,679][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:15,088][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:15,327][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:15,328][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:16,678][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:17,766][DEBUG][filewatch.sincedbcollection][main][7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f] writing sincedb (delta since last write = 15)
[2023-12-05T10:56:20,094][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:20,352][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:20,353][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:21,670][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:25,100][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:25,392][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:25,393][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:26,670][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:30,104][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:30,415][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:30,415][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:31,669][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:32,788][DEBUG][filewatch.sincedbcollection][main][7b1180756692affca0042711147933de5961a6fe3eba697acd9b7a0edcd9ca2f] writing sincedb (delta since last write = 15)
[2023-12-05T10:56:35,108][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:35,440][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-12-05T10:56:35,441][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-12-05T10:56:36,669][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-12-05T10:56:40,110][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-12-05T10:56:40,273][DEBUG]

Please help me.
Thanks.

Please do not post pictures of text; they are not searchable, and some folks cannot see them. Just post the text.

Most likely your Logstash has already read the file and is keeping track of it as a processed file.

  1. Remove E:\logstash-8.11.1\data\plugins\inputs\file\.sincedb_ce* (the sincedb file named in your log; a command for this is sketched below)
  2. Add a line:
input {
  file {
    sincedb_path => "NUL"
    type => "csv"
...
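
For step 1, a sketch from a Windows command prompt, assuming the sincedb path reported in your log (adjust to your install):

REM drop the stored read offsets so the file input re-reads the CSV
del "E:\logstash-8.11.1\data\plugins\inputs\file\.sincedb_ce*"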

Read about sincedb in the file input plugin documentation.
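
For completeness, your original input block with the fix applied would look like this; "NUL" is the Windows null device, so Logstash never persists its read position and re-reads the file from the beginning on every start:

input {
  file {
    sincedb_path => "NUL"
    type => "csv"
    path => "Z:/5/upwork/ctf/ThreatHound-main/sysmon.csv"
    start_position => "beginning"
  }
}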

PS: Half of my posts are related to sincedb :scream:

I solved it with your help.
Thanks.
