Logstash not pushing logs to Loki

Below is my Logstash config:

input {
    file {
        ecs_compatibility => disabled
        path => [
            "/a/logs/project_apps/**/*.log"
        ]
        start_position => beginning
        exclude => [
        ]
    }
}
filter {
    
    # generate valid timestamp
    ruby {
        code => '
            t = Time.at(event.get("@timestamp").to_f)
            event.set("@timestamp", LogStash::Timestamp.new(t.strftime("%FT%T.%3NZ")))
        '
    }
    
    grok {
        ecs_compatibility => disabled
        match => { "host" => "a%{GREEDYDATA:ip_addr}.d" }
    }
    grok {
        ecs_compatibility => disabled
        # two patterns for the same field, tried in order until one matches
        match => {
            "path" => [
                "/a/logs/project_apps/%{DATA:app}__%{DATA:version}/%{GREEDYDATA:filename}",
                "/a/logs/project_apps/%{DATA:app}/%{GREEDYDATA:filename}"
            ]
        }
    }
    mutate {
        update => { "host" => "X" }
        gsub => [
            'ip_addr', '-', '.',
            'filename', '(.+__)+.+?[-_]', '',
            'filename', '/', '-_-'
        ]
        remove_field => [ "@version" ]
        add_field => {
            "cluster" => "new"
            "job" => "logstash"
        }
    }

    if [version] !~ /.+/ {
        mutate {
            add_field => { "version" => 'unknown' }
        }
    }

    if [message] =~ /.{1000,}/ {
        truncate {
            fields => "message"
            length_bytes => 1000
            add_tag => [ "truncated_msg" ]
        }
    }
}
output {
    file {
        ecs_compatibility => disabled
        path => "/a/project/var/run/_logstash/logstash_out"
    }
    loki {
       url => "http://grafana.custom.com/loki/api/v1/push"
       message_field => "message"
       insecure_skip_verify => true
    }
}
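
Side note: as I understand the grafana/loki plugin docs, the loki output turns the event's remaining fields into Loki labels, so events that carry many fields can end up with a large label set. Below is only a hedged sketch of the same output with a couple of optional settings the plugin documents; tenant_id and the include_fields list are placeholders and depend on your plugin version and Loki setup.

output {
    loki {
        url => "http://grafana.custom.com/loki/api/v1/push"
        message_field => "message"
        # only needed when Loki runs in multi-tenant mode (placeholder value)
        # tenant_id => "my-tenant"
        # restrict which event fields become labels (newer plugin versions)
        # include_fields => ["app", "version", "cluster", "job", "host", "filename"]
    }
}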

Below are my Logstash logs:

[2023-09-27T18:16:54,096][INFO ][logstash.runner          ] Log4j configuration path used is: /usr/local/abproject/logstash/config/log4j2.properties
[2023-09-27T18:16:54,101][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.4.1", "jruby.version"=>"jruby 9.3.6.0 (2.6.8) 2022-06-27 7a2cbcd376 OpenJDK 64-Bit Server VM 11.0.18+10 on 11.0.18+10 +indy +jit [x86_64-linux]"}
[2023-09-27T18:16:54,104][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -XX:-UseSerialGC, -Djava.awt.headless=true, -XX:-HeapDumpOnOutOfMemoryError, -XX:HeapDumpPath=/usr/local/abproject/logstash/heapdump.hprof, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-09-27T18:16:54,423][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-09-27T18:16:55,823][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-09-27T18:16:57,303][INFO ][org.reflections.Reflections] Reflections took 111 ms to scan 1 urls, producing 125 keys and 434 values
[2023-09-27T18:16:58,141][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2023-09-27T18:16:58,212][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-09-27T18:16:58,291][INFO ][logstash.outputs.loki    ][main] Loki output plugin {:class=>"LogStash::Outputs::Loki"}
[2023-09-27T18:16:58,520][WARN ][logstash.javapipeline    ][main] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2023-09-27T18:16:58,603][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/a/project/etc/custlogfolder/ab_logstash.cfg"], :thread=>"#<Thread:0x2706330d run>"}
[2023-09-27T18:16:59,430][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.82}
[2023-09-27T18:16:59,492][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/local/abproject/project/var/run/_logstash/data/plugins/inputs/file/.sincedb_446674139e74f5602e20ea9bf75160ee", :path=>["/a/logs/project_apps/**/*.log"]}
[2023-09-27T18:16:59,515][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-09-27T18:16:59,568][INFO ][filewatch.observingtail  ][main][df50bb6e47317e3b25a2ea1a1810837e8367e42a3b53dea3bc5a836f9f478935] START, creating Discoverer, Watch with file and sincedb collections
[2023-09-27T18:16:59,649][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-09-27T18:17:00,654][INFO ][logstash.outputs.file    ][main][1fdb126e23818d7f1a10425d52a9685fccda2b8f490a81959f8becb0eb629ff1] Opening file {:path=>"/a/project/var/run/_logstash/logstash_out"}

I'm trying to send logs from Logstash to Loki (deployed using Helm charts), but it's not happening. Is the log line [2023-09-27T18:16:58,291][INFO ][logstash.outputs.loki ][main] Loki output plugin {:class=>"LogStash::Outputs::Loki"} all we get for the Loki setup, or should something else show up in the logs? Also, am I missing any config?
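
A minimal way to isolate the Loki side is a throwaway pipeline that skips the file input and the filters entirely. This is only a sketch with placeholder values; the generator input just emits a single synthetic test event:

input {
    generator {
        # emit one synthetic event and stop
        message => "loki connectivity test"
        count => 1
    }
}
output {
    loki {
        url => "http://grafana.custom.com/loki/api/v1/push"
        message_field => "message"
    }
}

Running that with --log.level debug should give more detail about whether the plugin is actually attempting the push to the Loki endpoint.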

Do you have any output in this file? Please share some sample messages.

@leandrojmp Yes, logs are being written to the logstash_out file. Below are some sample lines.

{"message":"  File \"/a/project/src-ww/test_app1/utils/scm.py\", line 14, in get_p4_client","app":"test-app1","@timestamp":"2023-09-28T06:25:25.166Z","job":"logstash","version":"unknown","path":"/a/logs/test_apps/test-app1/test-app1-test-app1-sync-stdout.log","cluster":"new","filename":"test-app1-test-app1-sync-stdout.log","host":"X","event":{"original":"  File \"/a/project/src-ww/test_app1/utils/scm.py\", line 14, in get_p4_client"},"ip_addr":"123.45.6.789"}
{"message":"    p4_client.connect()","app":"test-app1","@timestamp":"2023-09-28T06:25:25.166Z","job":"logstash","version":"unknown","path":"/a/logs/test_apps/test-app1/test-app1-test-app1-sync-stdout.log","cluster":"new","filename":"test-app1-test-app1-sync-stdout.log","host":"X","event":{"original":"    p4_client.connect()"},"ip_addr":"123.45.6.789"}
{"message":"  File \"/a/lib/python3.9/site-packages/P4.py\", line 807, in connect","app":"test-app1","@timestamp":"2023-09-28T06:25:25.166Z","job":"logstash","version":"unknown","path":"/a/logs/test_apps/test-app1/test-app1-test-app1-sync-stdout.log","cluster":"new","filename":"test-app1-test-app1-sync-stdout.log","host":"X","event":{"original":"  File \"/a/lib/python3.9/site-packages/P4.py\", line 807, in connect"},"ip_addr":"123.45.6.789"}
{"message":"    P4API.P4Adapter.connect( self )","app":"test-app1","@timestamp":"2023-09-28T06:25:25.166Z","job":"logstash","version":"unknown","path":"/a/logs/test_apps/test-app1/test-app1-test-app1-sync-stdout.log","cluster":"new","filename":"test-app1-test-app1-sync-stdout.log","host":"X","event":{"original":"    P4API.P4Adapter.connect( self )"},"ip_addr":"123.45.6.789"}
