Logstash not reading my config

I have my config under /etc/logstash/conf.d/myconfig.conf. Below is my simple config:

input {
  file {
    path => "/home/foo/logs/*.log"
    start_position => "beginning"
#    stat_interval => 1
#    discover_interval => 5
  }
}
filter {
  grok {
    match => { "message" => "^(?<starttime>[^ ]*) (?<service>[^ ]+)\/\w+ (?<endtime>[^ ]* [^ ]*) \[(?<loglevel>\w*).+\] (?<data>.*$)" }
  }
}

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
    ingest_url => "https://ingest-adxadmdev.eastus.kusto.windows.net/"
    app_id => "secret"
    app_key => "more secret"
    app_tenant => "othersecret"
    database => "mydb"
    table => "mytable"
    json_mapping => "basicmsg"
  }
}
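
For reference, the pipeline can be syntax-checked from the shell before debugging ingestion. This is only a sketch; /usr/share/logstash/bin/logstash is the default location for the RPM install and may differ on other setups.

# Parse the pipeline and exit without starting it (exit code 0 means the config is valid)
/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/myconfig.conf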

I set the log level to debug; the output is as follows.

[2023-03-30T11:07:53,883][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-kusto{"path"=>"/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt", "ingest_url"=>"https://ingest-<db>.eastus.kusto.windows.net/", "app_id"=>"secret", "app_key"=>"more_secret", "app_tenant"=>"someplace", "database"=>"mydb", "table"=>"mytable", "json_mapping"=>"basicmsg"}|[file]/etc/logstash/conf.d/logstash_cust.conf:24:5:```
kusto {
            path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
            ingest_url => "https://ingest-<db>.eastus.kusto.windows.net/"
            app_id => "secret"
            app_key => "more_secret"
            app_tenant => "someplace"
            database => "mydb"
            table => "mytable"
            json_mapping => "basicmsg"
    }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2023-03-30T11:07:53,902][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.55}
[2023-03-30T11:07:53,912][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_8af259e154d611ac8e2110921cf5729f", :path=>["/home/foo/logs/*.log"]}
[2023-03-30T11:07:53,915][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-03-30T11:07:53,920][INFO ][filewatch.observingtail  ][main][4267a5b0577bed96600640a9454b1d6e43af02133e2254598b206201a0fad4ce] START, creating Discoverer, Watch with file and sincedb collections
[2023-03-30T11:07:53,920][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-30T11:07:53,921][INFO ][filewatch.observingtail  ][main][a2076e83890d83f054f1acffeee8a9505d96d9ca8f18c6dcf2676a1b803a7e2a] START, creating Discoverer, Watch with file and sincedb collections
[2023-03-30T11:07:53,923][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x376f3ece@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[2023-03-30T11:07:53,923][DEBUG][filewatch.sincedbcollection][main][a2076e83890d83f054f1acffeee8a9505d96d9ca8f18c6dcf2676a1b803a7e2a] open: reading from /var/lib/logstash/plugins/inputs/file/.sincedb_8af259e154d611ac8e2110921cf5729f
[2023-03-30T11:07:53,925][DEBUG][filewatch.sincedbcollection][main][4267a5b0577bed96600640a9454b1d6e43af02133e2254598b206201a0fad4ce] open: reading from /var/log/logstash/log.db
[2023-03-30T11:07:53,929][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-03-30T11:07:54,024][DEBUG][com.microsoft.aad.msal4j.ConfidentialClientApplication] [Correlation ID: 364ff9f2-3d3c-4714-bd0f-be59c5160ad4] Access Token was returned
[2023-03-30T11:07:54,024][DEBUG][com.microsoft.aad.msal4j.ConfidentialClientApplication] [Correlation ID: cf17d79a-52d0-417c-972c-c04705c828dc] Access Token was returned
[2023-03-30T11:07:54,118][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][main] Refreshing Ingestion Resources
[2023-03-30T11:07:54,120][DEBUG][com.microsoft.aad.msal4j.ConfidentialClientApplication] [Correlation ID: 2d851cc7-249e-4632-a557-3336d03dfc7f] Access Token was returned
[2023-03-30T11:07:54,154][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][main] Refreshing Ingestion Resources Finised
[2023-03-30T11:07:54,160][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][main] Refreshing Ingestion Resources
[2023-03-30T11:07:54,161][DEBUG][com.microsoft.aad.msal4j.ConfidentialClientApplication] [Correlation ID: 8ea74739-262e-48d2-ba0d-09095aa67eee] Access Token was returned
[2023-03-30T11:07:54,190][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][main] Refreshing Ingestion Resources Finised
[2023-03-30T11:07:54,926][DEBUG][filewatch.sincedbcollection][main][a2076e83890d83f054f1acffeee8a9505d96d9ca8f18c6dcf2676a1b803a7e2a] writing sincedb (delta since last write = 1680188874)
[2023-03-30T11:07:54,928][DEBUG][filewatch.sincedbcollection][main][4267a5b0577bed96600640a9454b1d6e43af02133e2254598b206201a0fad4ce] writing sincedb (delta since last write = 1680188874)
[2023-03-30T11:07:54,986][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:07:55,195][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:07:56,369][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-03-30T11:07:56,369][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-03-30T11:07:56,987][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:07:57,196][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:07:58,920][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-30T11:07:58,987][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:07:59,197][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:00,988][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:01,198][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:01,384][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-03-30T11:08:01,384][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-03-30T11:08:02,988][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:03,198][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:03,920][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-30T11:08:03,954][DEBUG][logstash.outputs.kusto   ][main][150f7d335a287290b1e47b8cdca6e9e27e7fcc85e6a6218b7153a910d230ad6e] Starting stale files cleanup cycle {:files=>{}}
[2023-03-30T11:08:03,955][DEBUG][logstash.outputs.kusto   ][main][150f7d335a287290b1e47b8cdca6e9e27e7fcc85e6a6218b7153a910d230ad6e] 0 stale files found {:inactive_files=>{}}
[2023-03-30T11:08:03,955][DEBUG][logstash.outputs.kusto   ][main][efa8fe5c7aa19d892d4124a08d183094b48533e6a783327294e7d0176fbab0ad] Starting stale files cleanup cycle {:files=>{}}
[2023-03-30T11:08:03,955][DEBUG][logstash.outputs.kusto   ][main][efa8fe5c7aa19d892d4124a08d183094b48533e6a783327294e7d0176fbab0ad] 0 stale files found {:inactive_files=>{}}
[2023-03-30T11:08:04,988][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:05,199][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:06,407][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2023-03-30T11:08:06,408][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2023-03-30T11:08:06,989][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle
[2023-03-30T11:08:07,200][DEBUG][logstash.outputs.kusto   ][main] Starting flush cycle

The issue I am facing is that if I run logstash -f /etc/logstash/conf.d/myconfig.conf I can see the logs being ingested, but when I run Logstash as a service, nothing is pushed.

I am using CentOS 7.9 and Logstash 8.6.2.
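
One way to compare the two run modes is to check which user the service actually runs under and watch its own logs. This is only a rough sketch, assuming the default systemd unit name logstash from the RPM package.

# Show the user the systemd unit runs as (User=logstash by default)
systemctl cat logstash | grep -i '^User'

# Follow the service output for errors that never show up in the CLI run
journalctl -u logstash -f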

Which user did you use when you ran Logstash from the CLI?

When you run Logstash as a service it runs as the logstash user. Since you are using a file input, does the logstash user have permission to read the files specified in the path?
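
You can test this directly from a shell. A rough sketch, assuming the default logstash service account and the path from your config:

# Show the permissions of every directory component on the way to the logs
namei -l /home/foo/logs

# Try to list and read the logs as the logstash user
sudo -u logstash ls -l /home/foo/logs/
sudo -u logstash head -n 1 /home/foo/logs/*.log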

I strongly recommend that you do not keep the logs inside /home/ directories, but use another path where you can safely change the permissions so the logstash user can read them.
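
For example, something along these lines (only a sketch; the paths are illustrative, newly created log files need the same treatment unless a default ACL is set, and setfacl requires a filesystem with ACL support):

# Option A: keep the logs under a neutral path that the logstash group can read
mkdir -p /var/log/foo
cp /home/foo/logs/*.log /var/log/foo/
chgrp -R logstash /var/log/foo
chmod -R g+rX /var/log/foo

# Option B: grant the logstash user read access in place with ACLs
setfacl -m u:logstash:x /home/foo
setfacl -m u:logstash:rx /home/foo/logs
setfacl -m u:logstash:r /home/foo/logs/*.log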

Nice catch. I ran the CLI as root, which can read the logs. The issue is that my logs are under /home/foo. Do I need to give the logstash user access to /home, or is it better to just mount those logs as a Docker volume, which will run with root access?

I think I will go with Docker Compose and mount the logs as a volume.
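
A minimal sketch of that idea using plain docker run (a compose file would express the same mounts). The image tag matches the Logstash version above, /usr/share/logstash/pipeline is where the official image picks up pipeline configs, and the file input path in the config would have to be changed to the mounted /logs directory:

# Mount the host logs read-only plus the existing pipeline configs into the official image
docker run --rm \
  -v /home/foo/logs:/logs:ro \
  -v /etc/logstash/conf.d:/usr/share/logstash/pipeline:ro \
  docker.elastic.co/logstash/logstash:8.6.2

The sincedb would also need a persistent mount if re-ingesting everything on each container restart is a concern.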
