Logstash is not pushing logs to Azure

Hello,

I am using Logstash 7.9 on Ubuntu 18.04 LTS with OpenJDK 11. I configured the Azure Log Analytics output plugin with a configuration file that works on Logstash 2.3.
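For reference, the output plugin is the logstash-output-azure_loganalytics gem; assuming a standard package install under /usr/share/logstash, it was installed with:

/usr/share/logstash/bin/logstash-plugin install logstash-output-azure_loganalytics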

Logstash seems to start fine, but no logs are pushed, and the Logstash log stops moving after startup:

[2020-09-10T16:09:14,235][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10-post-Ubuntu-0ubuntu118.04.1 on 11.0.8+10-post-Ubuntu-0ubuntu118.04.1 +indy +jit [linux-x86_64]"}
[2020-09-10T16:09:21,405][INFO ][org.reflections.Reflections] Reflections took 144 ms to scan 1 urls, producing 22 keys and 45 values
[2020-09-10T16:09:23,494][INFO ][logstash.outputs.azureloganalytics] Using version 0.1.x output plugin 'azure_loganalytics'. This plugin isn't well supported by the community and likely has no maintainer.
[2020-09-10T16:09:25,307][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/openidmConfAudit.conf", "/etc/logstash/conf.d/openidmConfDebug.conf"], :thread=>"#<Thread:0x2dc36870 run>"}
[2020-09-10T16:09:27,632][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>2.27}
[2020-09-10T16:09:28,183][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_df3821128bb03d7cf4b885b983ebd6fa", :path=>["/data/var/log/openidm/audit/activity.csv"]}
[2020-09-10T16:09:28,220][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_d9139ba65fc719ab6eaabcbf943283b3", :path=>["/data/var/log/openidm/audit/access.csv"]}
[2020-09-10T16:09:28,242][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_66c51dadb9ce448f310995b88e8d639c", :path=>["/data/var/log/openidm/audit/authentication.csv"]}
[2020-09-10T16:09:28,252][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_edb10ded5868f7605b0091db59114034", :path=>["/data/var/log/openidm/audit/config.csv"]}
[2020-09-10T16:09:28,265][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_e5ff522f131972e40906488eb2259d24", :path=>["/data/var/log/openidm/audit/recon.csv"]}
[2020-09-10T16:09:28,294][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_e8712b0c6f42323c1d6d4fe4a0c78b4f", :path=>["/data/var/log/openidm/audit/sync.csv"]}
[2020-09-10T16:09:28,304][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_624126a7a61ec59a41bff6d9fd6cfde0", :path=>["/data/var/log/openidm/debug/openidm0.log"]}
[2020-09-10T16:09:28,373][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-09-10T16:09:28,448][INFO ][filewatch.observingtail  ][main][2051a634b49ff1f3b0b45b8bb03376acd94f7317ee3d025d2b84bf0940cce281] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,448][INFO ][filewatch.observingtail  ][main][148cbffcaf0c1f771f3688db63b50c3e53d7579c22d808a57f65953a280d90a9] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,457][INFO ][filewatch.observingtail  ][main][b18b4655fcfb20ad87128c25d4fa87345097c5198459582bdeddcdcda2240b13] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,432][INFO ][filewatch.observingtail  ][main][ff066052ed43c4968187aa7a30e51db907125c6e697b6770008fc7f4b520aa20] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,448][INFO ][filewatch.observingtail  ][main][4a6c03293566e5d0df8261fe28cc723d4cb05983332894fc8d98bb884978cc3c] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,451][INFO ][filewatch.observingtail  ][main][c7d20440d94c8fb3c3380f257502f7dcde244a6c85e737d726dc2da5eac97711] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,457][INFO ][filewatch.observingtail  ][main][3da699bc620c5d2b3c6807a3c792d5c640b5a899bc3a3bbaac960d6506bcc2f2] START, creating Discoverer, Watch with file and sincedb collections
[2020-09-10T16:09:28,727][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-09-10T16:09:29,040][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

My logs are on a data drive with read rights for the logstash user; I do not understand what is wrong. Thanks in advance for the help.
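In case it is useful for debugging: the file input and filewatch internals log much more at debug level. Assuming a package install, this can be turned on in /etc/logstash/logstash.yml:

log.level: debug

At debug level, filewatch should show whether its glob is discovering any files at all.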

What does your file input configuration look like?

Here is one of my configuration files (the one for the debug logs); I am showing only the Jinja template of the file:

# Input processing
input {
    # IDM debug log
    file {
        type => "idmDebug"
        start_position => "beginning"
        path => [ "{{ idm_system_logging_base_dir }}/openidm/debug/openidm0.log" ]
        codec => multiline {
            pattern => "^%{MONTH}"
            negate => true
            what => "previous"
        }
    }
}
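To illustrate what the multiline codec does here (these lines are made up for illustration, not taken from my real logs): only lines starting with a month name begin a new event, so a stack trace such as

Jul 10, 2018 8:34:48.186 AM org.forgerock.openidm.SomeClass method
SEVERE: Unexpected failure
java.lang.NullPointerException
        at org.forgerock.openidm.SomeClass.method(SomeClass.java:42)

is folded into a single event: the three continuation lines do not match ^%{MONTH} and are appended to the previous line.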
 
# Filter processing
filter {
    if [type] == "idmDebug" {
 
        grok {
            match => {
                "message" => "%{MONTH:month} %{MONTHDAY:monthday}, %{YEAR:year} %{TIME:time} %{WORD:daytime} (?<details>(.|\r|\n)*)"
            }
            remove_field => [ "message" ]
        }
 
    
        # date format: Jul 10, 2018 8:34:48.186 AM
        mutate {
            add_field => { "formattedTime" => "%{month} %{monthday}, %{year} %{time} %{daytime}" }
            remove_field => [ "month", "monthday", "year", "time", "daytime" ]
        }

        date {
            # the timestamp carries milliseconds (see the format above), so the patterns need .SSS
            match => [ "formattedTime", "MMM dd, yyyy hh:mm:ss.SSS aa", "MMM d, yyyy hh:mm:ss.SSS aa", "MMM dd, yyyy h:mm:ss.SSS aa", "MMM d, yyyy h:mm:ss.SSS aa" ]
            target => "logTimestamp"
            remove_field => [ "formattedTime" ]
        }
    }
}
 
# Output processing
output {
    if [type] == "idmDebug" {
        azure_loganalytics {
            customer_id => "{{ azure_customer_id }}"
            shared_key => "{{ azure_shared_key }}"
            log_type => "openidmLogDebug"
            time_generated_field => "logTimestamp"
        }
    }
}
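To check whether events are being read at all, independently of Azure, a quick isolation test (a debugging sketch, not my production config) is to run the same input with a console output; sincedb_path => "/dev/null" makes Logstash re-read the file on every run:

# test.conf - same file input, console output instead of Azure
input {
    file {
        type => "idmDebug"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        path => [ "/data/var/log/openidm/debug/openidm0.log" ]
    }
}
output {
    stdout { codec => rubydebug }
}

Run it in the foreground with /usr/share/logstash/bin/logstash -f test.conf; if nothing prints, the problem is on the input side rather than in the Azure output.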

EDIT: I am still facing the issue. I ensured the logs can be read by Logstash and restarted the service, but nothing happens, and I see nothing wrong in the logs...

chmod -R 755 /data/var/log/openidm/debug/openidm0.log.0
chmod -R 755 /data/var/log/openidm/audit/*.csv
service logstash restart
tail -f /var/log/logstash/logstash-plain.log
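A more direct check than chmod is to read the file as the service account itself (assuming the service runs as the logstash user, the default for the package install):

sudo -u logstash head -n 1 /data/var/log/openidm/debug/openidm0.log

If this fails with "Permission denied" even though the file mode looks fine, a parent directory is missing the execute (traverse) bit.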

SOLUTION:
OK, this was an issue with the rights I had set on my logs: I needed to grant the execute (traverse) right on all the parent folders of my log files so that the logstash user can reach them:

chmod -R 755 /data
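For anyone hitting this later: a recursive 755 on /data is broader than necessary. The permission chain can be inspected with namei (part of util-linux on Ubuntu), and only the missing execute bits granted on the parent directories (the paths below are the ones from this thread):

namei -l /data/var/log/openidm/debug/openidm0.log
chmod o+x /data /data/var /data/var/log /data/var/log/openidm /data/var/log/openidm/debug /data/var/log/openidm/audit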
