.logstash_jdbc_last_run not created when running logstash as a service

Hi,
My problem occurs when I run Logstash as a service.
I'm working with CentOS 6 and ELK 7.9.2.
With the same config file using logstash-input-jdbc, everything works fine when running Logstash from the command line:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash_stat_EC_JDBC.conf

But when Logstash runs as a service, the file ".logstash_jdbc_last_run" is not created at all, while the other parameters such as schedule work fine.
So when the service restarts, it starts from the beginning instead of from the last run.
Could anyone please help me with this?
Here is my config file:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-42.2.6.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://*****:5432/***"
    jdbc_user => "****"
    jdbc_password => "*****"
    jdbc_validate_connection => true
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    tracking_column_type => "numeric"
    schedule => "*/5 * * * *"
    statement_filepath => "/etc/logstash/conf.d/requete_EC_JDBC.sql"

  }
}

filter {
        mutate { remove_field => ["@version", "message", "host", "path", "_index", "_type", "_id", "_score", "tags", "unix_ts_in_secs"] }
        mutate { convert => {
                "codepubliablemessage" => "string"
                "messageid" => "string"
                "traitementauto" => "boolean"
                }
        }

        date {
                match => [ "datefindemande", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss.SS", "yyyy-MM-dd HH:mm:ss.S", "yyyy-MM-dd HH:mm:ss.SSSSSS"]
                target => "datefindemande"
        }

        date {
                match => [ "datederniermessage", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss.SS", "yyyy-MM-dd HH:mm:ss.S", "yyyy-MM-dd HH:mm:ss.SSSSSS"]
                target => "datederniermessage"
        }
}


output {
        elasticsearch {
                hosts => ["localhost:9200"]
                user => "******"
                index => "table_merge_jdbc"
                password => "*********"
        }

        stdout {
                codec => rubydebug
        }

}

Do you have anything in the Logstash logs referencing .logstash_jdbc_last_run?

Did you run Logstash from the CLI as the root user or using sudo?

"$HOME/.logstash_jdbc_last_run" doesn't exist at all. But i don't have any problem with unix_ts_in_secs getting updated until i restart the service.
I'm running the service using sudo : sudo initctl start logstash

And nothing refrencing .logstash_jdbc_last_run in logstash logs : /var/log/logstash/logstash-plain.log
Here is my log file :

[2022-08-26T14:18:12,486][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@localhost:9200/]}}
[2022-08-26T14:18:12,915][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@localhost:9200/"}
[2022-08-26T14:18:12,975][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2022-08-26T14:18:12,980][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-08-26T14:18:13,067][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2022-08-26T14:18:13,214][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-08-26T14:18:13,288][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/logstash_stat_EC_JDBC.conf"], :thread=>"#<Thread:0x1fd761c3 run>"}
[2022-08-26T14:18:13,332][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2022-08-26T14:18:14,421][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.12}
[2022-08-26T14:18:14,558][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-08-26T14:18:14,730][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-08-26T14:18:15,183][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2022-08-26T14:20:03,864][INFO ][logstash.inputs.jdbc     ][main][2587daf677b21735d56517aabc45ba084e399d2483fa1994d846171cc9242d02] (2.933261s) SELECT messageid, msg.codestatut AS codestatutmessage, datecreationmessage, libellemessageza, libellemessagezu, codemessage,
        msg.codecanal AS codecanalmessage, msg.codepubliable AS codepubliablemessage , codenotification, codestructuremessage, libellestructuremessage, flaglecture,
        dem.demandeid, codedemande, dem.codestatut AS codestatutdemande, datecreationdemande, libelledemandeza, libelledemandezu, dem.codepubliable AS codepubliabledemande,
        dem.codecanal AS codecanaldemande, dem.codesa, codetype, codeimportance, codestructurecreateur, libellestructurecreateur, urlsaorigine,
        datefindemande, idreferenceorigine, traitementauto, datederniermessage, codestructuredernmsg, contentieux,
        codecategorie, codecopro, codepcr, codepcrcategorie,
        st.libellelongstatutza,
        canal.libellecanal,
        extract(epoch from datecreationmessage) AS unix_ts_in_secs

FROM gaia2.m_message AS msg
INNER JOIN gaia2.m_demande AS dem ON msg.demandeid = dem.demandeid
INNER JOIN gaia2.t_statut AS st ON dem.codestatut = st.codestatut
INNER JOIN gaia2.t_canal AS canal ON dem.codecanal = canal.codecanal
WHERE (extract(epoch from datecreationmessage) > 0 AND datecreationmessage < NOW())
ORDER BY datecreationmessage ASC

[2022-08-26T14:25:00,494][INFO ][logstash.inputs.jdbc     ][main][2587daf677b21735d56517aabc45ba084e399d2483fa1994d846171cc9242d02] (0.064751s) SELECT messageid, msg.codestatut AS codestatutmessage, datecreationmessage, libellemessageza, libellemessagezu, codemessage,
        msg.codecanal AS codecanalmessage, msg.codepubliable AS codepubliablemessage , codenotification, codestructuremessage, libellestructuremessage, flaglecture,
        dem.demandeid, codedemande, dem.codestatut AS codestatutdemande, datecreationdemande, libelledemandeza, libelledemandezu, dem.codepubliable AS codepubliabledemande,
        dem.codecanal AS codecanaldemande, dem.codesa, codetype, codeimportance, codestructurecreateur, libellestructurecreateur, urlsaorigine,
        datefindemande, idreferenceorigine, traitementauto, datederniermessage, codestructuredernmsg, contentieux,
        codecategorie, codecopro, codepcr, codepcrcategorie,
        st.libellelongstatutza,
        canal.libellecanal,
        extract(epoch from datecreationmessage) AS unix_ts_in_secs

FROM gaia2.m_message AS msg
INNER JOIN gaia2.m_demande AS dem ON msg.demandeid = dem.demandeid
INNER JOIN gaia2.t_statut AS st ON dem.codestatut = st.codestatut
INNER JOIN gaia2.t_canal AS canal ON dem.codecanal = canal.codecanal
WHERE (extract(epoch from datecreationmessage) > 1661520252.039 AND datecreationmessage < NOW())
ORDER BY datecreationmessage ASC

...

But how did you run Logstash when running it from the CLI? You didn't say.

Assuming that you installed Logstash using the RPM, when you run it as a service it runs as the logstash user; when you run it from the command line as root or with sudo, it runs as the root user. This can cause issues, since Logstash creates some files, and if you later run it as a service, the logstash user may not have all the permissions it needs.
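If you want to confirm that, here is a quick check, assuming the default RPM layout (adjust the paths to your install):

ps -ef | grep -i [l]ogstash                          # which user the running Logstash process belongs to
ls -la /root/.logstash_jdbc_last_run                 # where a root/sudo CLI run may have written the file (depends on $HOME)
ls -la /usr/share/logstash/.logstash_jdbc_last_run   # where a run as the logstash user would write it, if its home is /usr/share/logstash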

Do you have the file /usr/share/logstash/.logstash_jdbc_last_run?

Sorry, when running from the CLI I ran Logstash with sudo, and the file $HOME/.logstash_jdbc_last_run is created and updated.
As for the second question, I don't have the file /usr/share/logstash/.logstash_jdbc_last_run.
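In case it matters: the plugin's default last_run_metadata_path is "$HOME/.logstash_jdbc_last_run", so I guess where the file lands depends on the home directory of the user Logstash runs as. A quick check, assuming the RPM-created logstash user:

getent passwd logstash    # the second-to-last field is the home directory, e.g. /usr/share/logstash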

Your explanation solved my problem: as you mentioned, .logstash_jdbc_last_run needs to be writable by the logstash user.
So I changed the default path by adding the parameter:

last_run_metadata_path => "/usr/share/logstash/.logstash_jdbc_last_run"

Now that this file is owned by the logstash user (-rw-r--r-- 1 logstash logstash 19 Aug 26 18:05 .logstash_jdbc_last_run), it is created and updated correctly, and restarting the service also works as expected.
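For reference, this is how I checked it (the chown is only needed if a previous root CLI run left a root-owned file at that path):

cat /usr/share/logstash/.logstash_jdbc_last_run      # holds the last tracked unix_ts_in_secs value
ls -la /usr/share/logstash/.logstash_jdbc_last_run   # should be owned by logstash:logstash
sudo chown logstash:logstash /usr/share/logstash/.logstash_jdbc_last_run   # only if an old root-owned copy is in the way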
Thank you so much
