Logstash doesn't split log into separate fields in Elasticsearch index

Hi everyone!

I am trying to import logs into an Elasticsearch index using Filebeat and Logstash.
The index is created, but the whole log line ends up in the single message field.

To start, I'm trying to extract the datetime into a separate field and, ideally, make it the timestamp in Kibana.

Here is an example of my log:

2024-10-27 22:00:32.289 [DefaultQuartzScheduler_Worker-5] DEBUG c.c.t.s.a.f.t.i.SomeMethod[run()][line 154]: Some log message here "with quatation" or without

In logstash.conf I have a grok expression like this:

filter {
  if [type] == "logfile" {
    grok {
      match => { "message" => "%{DATA:fc_timestamp}" }
    }
    date {
      match => [ "fc_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
      target => "fc_timestamp"
    }
  }
}

Unfortunately it doesn't create the fc_timestamp field in the index.

The whole grok pattern, which I tested in a grok debugger and which seems to work, is:

match => { "message" => "%{DATA:fc_timestamp} \[%{DATA:worker_number}\] %{LOGLEVEL:log_level} %{DATA:method}: %{GREEDYDATA:log}" }

But for now I am struggling just to separate the log entry date and, ideally, make it the timestamp.
Please help.

I made some changes to logstash.conf to see if there is any improvement, but there isn't:

input {
  beats {
    port => 5044
    type => "fc_logfile"
  }
}

filter {
  if [type] == "fc_logfile" {
    grok {
      match => { "message" => "%{DATA:fc_timestamp}" }
    }
    date {
      match => [ "fc_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
  }
}

Unfortunately, @timestamp in Kibana still holds the date and time of import and not the log record's datetime.

[image]

The only change I can see is in the type field:

[image]

Any help will be much appreciated.

Regards!

DATA can match anything, including nothing at all. If you add keep_empty_captures => true to the grok filter then you will see that "nothing at all" is exactly what it is matching.

"fc_timestamp" => "",

So you need to modify the pattern to force DATA to capture something. If you use %{DATA:fc_timestamp} with a trailing space then you will get

"fc_timestamp" => "2024-10-27",

To force the capture of the whole timestamp, try

grok { match => { "message" => "%{DATA:fc_timestamp} \[" } }

Your date filter will then work.
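Putting that together with the full pattern you already tested in the debugger and your existing date filter, the whole filter block could look something like this (a sketch, reusing the field names from your original post):

filter {
  if [type] == "fc_logfile" {
    grok {
      # the literal " [" after DATA forces fc_timestamp to capture the full datetime
      match => { "message" => "%{DATA:fc_timestamp} \[%{DATA:worker_number}\] %{LOGLEVEL:log_level} %{DATA:method}: %{GREEDYDATA:log}" }
    }
    date {
      match => [ "fc_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
  }
}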

Building a grok pattern incrementally is a great approach to creating complex patterns; you just need to start with enough to get an initial match :slight_smile:

Thank you for your help!