Extract original timestamps from logs?

Hello, I've noticed that when sending logs with Logstash, the original timestamps are not respected.

These are my typical messages, produced by Spring Boot:

2022-09-25 21:19:57,175 INFO  [it.unidoc.cdr.core.bean.ForwardProvideAndRegisterManager] (Camel (camel-1) thread #153 - WireTap) Evaluate 2 item in OR...
2022-09-25 21:19:57,175 INFO  [it.unidoc.cdr.core.route.Iti41Repository] (Camel (camel-1) thread #153 - WireTap) urn:uuid:46b6a84b-e88c-4cc5-86db-87b588694a26 promoted for forwarding to http://feii-fsebroker:8081/fsebroker/services/submit...
2022-09-25 21:19:57,189 INFO  [it.unidoc.cdr.core.route.Iti41MllpSource] (TcpSocketConsumerRunnable[mllp://0.0.0.0:9010] - /192.168.0.7:55788 => /192.168.0.32:9010) Response from mllp://0.0.0.0:9010?autoAck=true&maxConcurrentConsumers=100: SUCCESS
2022-09-25 21:19:57,192 INFO  [ca.uhn.hl7v2.util.idgenerator.FileBasedGenerator] (TcpSocketConsumerRunnable[mllp://0.0.0.0:9010] - /192.168.0.7:55788 => /192.168.0.32:9010) Could not read ID file /./id_file 
2022-09-25 21:19:57,192 WARN  [ca.uhn.hl7v2.util.idgenerator.FileBasedGenerator] (TcpSocketConsumerRunnable[mllp://0.0.0.0:9010] - /192.168.0.7:55788 => /192.168.0.32:9010) Could not write ID to file /./id_file, going to use internal ID generator. /./id_file (Permission denied)
2022-09-25 21:19:57,198 WARN  [org.apache.camel.component.mllp.MllpTcpServerConsumer.9010] (TcpSocketConsumerRunnable[mllp://0.0.0.0:9010] - /192.168.0.7:55788 => /192.168.0.32:9010) Exchange property CamelMllpCloseConnectionAfterSend = true - closing connection
2022-09-25 21:19:57,200 INFO  [it.unidoc.cdr.core.route.Iti41Forward] (Camel (camel-1) thread #2 - seda://iti41forward) Forwarding to feii-fsebroker:8081/fsebroker/services/submit...
2022-09-25 21:19:57,257 INFO  [it.unidoc.cdr.core.route.Iti41Forward] (Camel (camel-1) thread #2 - seda://iti41forward) Response from seda://iti41forward: SUCCESS
2022-09-25 21:19:57,262 INFO  [it.unidoc.cdr.core.route.Iti41Forward] (Camel (camel-1) thread #2 - seda://iti41forward) Forwarded entity "[2022-09-25T21:19:57.257903] "feii-fsebroker:8081/fsebroker/services/submit"" stored
2022-09-25 21:26:43,457 INFO  [it.unidoc.cdr.core.bean.SpoolHouseKeeper] (scheduling-1) Removing /opt/wildfly/standalone/data/cdr/source/42e9f218-e796-4ec3-9e2a-a3b7e378a482...
2022-09-25 21:30:43,649 INFO  [it.unidoc.cdr.core.bean.SpoolHouseKeeper] (scheduling-1) Removing /opt/wildfly/standalone/data/cdr/spool/stream/46b6a84b-e88c-4cc5-86db-87b588694a26.temp...

Is it possible to extract the timestamp from the log line and use it as the event timestamp?

For example, the original timestamp here is 2022-09-25 21:19:57,175,

but on the server I get the timestamp of the processing time instead.

I've used this configuration in my conf file, but with no results:


input {
  file {

    path => [ "/logstash_dir/P1/*.*" ]
    
    
    start_position => "beginning"
  }
}


 #Grokking Spring Boot's default log format
  grok {
    match => [ "message", 
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }

  #Parsing out timestamps which are in timestamp field thanks to previous grok section
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}


output {
    if " ERROR " in [message] {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "errors_prima"
		}
		
		
    } else {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "others_prima"
		}
    }
		stdout { codec => rubydebug }
}

Yes, use a comma, not a dot. If you don't force the target, you will not get a _dateparsefailure error.

 date {
    match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    # target => "@timestamp"
    # timezone => "Europe/Berlin"
}

Tested on LS 8.4
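As a minimal sketch of how the two pieces fit together, assuming the grok stage captures the leading timestamp into its own `timestamp` field (field names are illustrative):

```
filter {
  grok {
    # Capture the leading "2022-09-25 21:19:57,175" portion into its own field;
    # TIMESTAMP_ISO8601 also matches a comma before the milliseconds
    match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp}" }
  }
  date {
    # Comma before the milliseconds, matching the log format
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    # With no target set, the parsed value replaces @timestamp
  }
}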

Thanks!

But is my grok section correct?

[2022-10-04T16:20:27,484][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:prima, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"input\", \"filter\", \"output\" at line 13, column 3 (byte 168) after ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:383:in `block in converge_state'"]}

[2022-10-04T16:20:27,868][INFO ][org.reflections.Reflections] Reflections took 107 ms to scan 1 urls, producing 119 keys and 417 values 

[2022-10-04T16:20:28,837][WARN ][deprecation.logstash.codecs.plain] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release

This is my file:

input {
  file {

    path => [ "/logstash_dir/P1/*.*" ]
    
    
    start_position => "beginning"
  }
}


 #Grokking Spring Boot's default log format
  grok {
    match => [ "message", 
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }

 date {
    match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    # target => "@timestamp"
    # timezone => "Europe/Berlin"
   }


output {
    if " ERROR " in [message] {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "error_p1"
		}
		
		
    } else {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "others_p1"
		}
    }
		stdout { codec => rubydebug }
}

You have missed the "filter" block, and the date filter needs the "timestamp" field. I left "message" because I used it with a generator input just for testing. Disclaimer: I haven't tested the grok pattern.

input {
  file {

    path => [ "/logstash_dir/P1/*.*" ]
    
    
    start_position => "beginning"
  }
}

filter{

 #Grokking Spring Boot's default log format
  grok {
    match => [ "message", 
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }

 date {
    match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    # target => "@timestamp"
    # timezone => "Europe/Berlin"
   }

}


output {
    if " ERROR " in [message] {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "error_p1"
		}
		
		
    } else {
		elasticsearch {
			hosts => ["elasticsearch:9200"]
			index => "others_p1"
		}
    }
		stdout { codec => rubydebug }
}
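The generator-based test mentioned above could be sketched like this, replacing the file input so the filter can be exercised without real log files (the sample line is taken from the logs at the top of this topic):

```
input {
  generator {
    # Emit one sample line from the original logs, a single time
    lines => [ "2022-09-25 21:19:57,175 INFO  [it.unidoc.cdr.core.bean.ForwardProvideAndRegisterManager] (Camel (camel-1) thread #153 - WireTap) Evaluate 2 item in OR..." ]
    count => 1
  }
}
```

With `stdout { codec => rubydebug }` in the output section, you can then check whether @timestamp picks up the parsed value.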

Hello, I've changed the config but I still have errors:

{
      "@version" => "1",
       "message" => "2022-09-22 14:26:09,326 INFO  [org.apache.camel.component.mllp.MllpTcpServerConsumer.9010] (TcpServerConsumerValidationRunnable[mllp://0.0.0.0:9010] - /192.168.0.7:54650 => /192.168.0.32:9010) startConsumer(Socket[addr=/192.168.0.7,port=54650,localport=9010]) - starting consumer",
    "@timestamp" => 2022-10-04T18:56:30.641Z,
          "host" => "5584e91225ff",
          "tags" => [
        [0] "_grokparsefailure",
        [1] "_dateparsefailure"
    ],
          "path" => "/logstash_dir/P1/server.log.2022-09-22"
}



filter{

 #Grokking Spring Boot's default log format
  grok {
    match => [ "message", 
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
  }

 date {
    match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    # target => "@timestamp"
    # timezone => "Europe/Berlin"
   }

}

That first timestamp is in a common format, so use this... then LOGLEVEL, then what I think is a JAVACLASS... you can work it out from there; I am not a regex expert.

2022-09-25 21:19:57,175 INFO  [it.unidoc.cdr.core.bean.ForwardProvideAndRegisterManager] (Camel (camel-1) thread #153 - WireTap) Evaluate 2 item in OR...

%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}  \[%{JAVACLASS:class}\] %{GREEDYDATA:log_details} 

{
  "log_details": "(Camel (camel-1) thread #153 - WireTap) Evaluate 2 item in OR...",
  "level": "INFO",
  "class": "it.unidoc.cdr.core.bean.ForwardProvideAndRegisterManager",
  "timestamp": "2022-09-25 21:19:57,175"
}
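Putting the suggested pattern together with the date filter, a complete filter section for this WildFly/Camel log layout might look like the sketch below (untested; `\s+` is used instead of a literal double space to tolerate the padding after the level):

```
filter {
  grok {
    # Layout: timestamp LEVEL [class] (thread) message
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\s+\[%{JAVACLASS:class}\] %{GREEDYDATA:log_details}" }
  }
  date {
    # Comma before the milliseconds, as in the original lines
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}
```

The output section can stay as before; on success, @timestamp should carry the original log time rather than the ingestion time.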

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.