Logstash date setting error

Hello! I want to connect Logstash to Elasticsearch using the jdbc input.
I created a conf file to run Logstash, but I always get this error. I have tried several changes in the conf file, but it still does not work. Can anyone help me figure out this error? I am really stuck. I hope you can share a way to fix this. Thank you!
error message


config

Hello @yy_isam

Welcome to the Elastic community :slight_smile:

As per the log, it reports the ERROR below:

Try setting ENV['TZ'] = 'Continent/City' error in logstash

Hence, add TZ as an environment variable, with your zone in the "Continent/City" form (e.g. "Asia/Taipei") as the value.
Steps:
This PC -> Advanced System Settings -> System Variables -> New -> TZ (Variable) -> Continent/City (Value).
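If you have a terminal handy, the same idea can be sketched on the command line. This is only a sketch; the zone Asia/Taipei is an example, so substitute your own:

```shell
# POSIX shell sketch: set TZ for the current session and sanity-check the spelling.
# On Windows cmd the equivalents are `set TZ=Asia/Taipei` (current session only)
# or `setx TZ "Asia/Taipei"` (persisted, takes effect in newly opened shells).
export TZ="Asia/Taipei"
date   # prints the local time in the chosen zone if the name is valid
```

Note that `setx` only affects processes started after it runs, so restart your shell (and Logstash) after setting it.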

There is a typo in your config.

You are using Asia/Tapei, but it is Asia/Taipei.

Thanks guys, but I changed this and the error is still there...

Thanks for the help, I set it the way you said, but I still get the error...

Please share your entire Logstash configuration pipeline as plain text, using the Preformatted option, the </> button.

Share a sample of your message as well so people can try to reproduce your pipeline and see what is wrong.

Also share the most recent error log you have, also as plain text.

Avoid sharing screenshots of configurations and log errors, as they can be pretty hard to read and impossible to reproduce.

filter {
  grok {
    match => { "message" => "^%{NGINX_ACCESSLOG_COMBINED}$" }
  }
  mutate { add_field => { "@timestamp_source" => "1234567890123456789" } }
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss Z" ]
    timezone => "Asia/Taipei"
    target => "@timestamp"
  }
}
input {
  beats {
    port => 5044
  }
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://192.168.100.100,1433;encrypt=true;database=E;integratedSecurity=false;"
    jdbc_driver_library => "D:\elk\insert\JDBC\mssql-jdbc-12.2.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "aaa"
    jdbc_password => "!QAZxsw23edc"
    jdbc_default_timezone => "Asia/Taipei"
    schedule => "*/1 * * * *"
    last_run_metadata_path => "D:\elk\op\elasticsearch-8.8.0-windows-x86_64\elasticsearch-8.8.0\mssqllog\FMFileFolder.txt"
 
    statement => "SELECT     fmfilefolderguid, syswebsiteguid, SYSDepartmentGUID, ParentFileFolderGUID, Path, FolderName, FolderDesc, DefFileEntryTypeId, LastPostDate, 
                      ProcessUserID, ProcessDate, SortOrder, IsDelete
                  FROM         EPA_FM.dbo.FMFileFolder Where  ProcessDate > :sql_last_value"
     
    type => "folder"
  }
  
}



output {   
  if [type] == "folder" {
    elasticsearch {
      hosts => ["http://127.0.0.1:9200"]
      index => "filefolder"
      document_id => "%{fmfilefolderguid}"
      user => "elastic"
      password => "a123456"
    }
  }

}

Is this your entire pipeline? It starts with the filter block; do you have anything above it?

You have a beats input whose events will never be output, because you do not have an output for them; it makes no sense to keep it if this is your entire pipeline.

How are you starting logstash? By command line or as a service? Is this the only pipeline you are running or are you using multiple pipelines with pipelines.yml?

Also, share a recent log error.
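For reference, a conventional single-pipeline layout puts input first and gives every event a route to an output. A minimal skeleton (section order only, with placeholder comments instead of your actual settings, and with the beats input omitted since it has no matching output) would look like:

```
input {
  jdbc {
    # ... your jdbc settings ...
    type => "folder"
  }
}

filter {
  # ... grok / mutate / date ...
}

output {
  if [type] == "folder" {
    elasticsearch {
      # ... your elasticsearch settings ...
    }
  }
}
```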

hello @leandrojmp
Sorry, this is my first question; if I forgot anything, please tell me.

Thanks!

cmd

.\bin\logstash -f .\config\logstash.conf --log.level debug

error message

[ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ArgumentError) Cannot determine timezone from nil\n(secs:1686631138.0,utc~:\"2023-06-13 04:38:58.0\",ltz~:nil)\n(etz:nil,tnz:\"TST\",tziv:\"2.0.6\",tzidv:\"1.2023.3\",rv:\"2.6.8\",rp:\"java\",win:true,rorv:nil,astz:nil,eov:\"1.2.7\",eotnz:\"???\",eotnfz:\"???\",eotlzn:\"???\",\ndebian:nil,centos:nil,osx:nil)\nTry setting `ENV['TZ'] = 'Continent/City'` in your script (see https://en.wikipedia.org/wiki/List_of_tz_database_time_zones)", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:846)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1229)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:131)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.RubyClass.newInstance(RubyClass.java:911)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", 
"org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:549)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:92)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:226)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:393)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:206)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:325)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:309)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)", "java.base/java.lang.Thread.run(Thread.java:833)"]}

Nothing is set in pipelines.yml.

logstash.conf

filter {
  grok {
    match => { "message" => "^%{NGINX_ACCESSLOG_COMBINED}$" }
  }
  mutate { add_field => { "@timestamp_source" => "1234567890123456789" } }
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss Z" ]
    timezone => "Asia/Taipei"
    target => "@timestamp"
  }
}
input {
  beats {
    port => 5044
  }
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://192.168.100.100,1433;encrypt=true;database=E;integratedSecurity=false;"
    jdbc_driver_library => "D:\elk\insert\JDBC\mssql-jdbc-12.2.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "aaa"
    jdbc_password => "!QAZxsw23edc"
    jdbc_default_timezone => "Asia/Taipei"
    schedule => "*/1 * * * *"
    last_run_metadata_path => "D:\elk\op\elasticsearch-8.8.0-windows-x86_64\elasticsearch-8.8.0\mssqllog\FMFileFolder.txt"
 
    statement => "SELECT     fmfilefolderguid, syswebsiteguid, SYSDepartmentGUID, ParentFileFolderGUID, Path, FolderName, FolderDesc, DefFileEntryTypeId, LastPostDate, 
                      ProcessUserID, ProcessDate, SortOrder, IsDelete
                  FROM         EPA_FM.dbo.FMFileFolder Where  ProcessDate > :sql_last_value"
     
    type => "folder"
  }
  
}



output {   
  if [type] == "folder" {
    elasticsearch {
      hosts => ["http://127.0.0.1:9200"]
      index => "filefolder"
      document_id => "%{fmfilefolderguid}"
      user => "elastic"
      password => "a123456"
    }
  }

}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.