Field merge

Hello,

Can you please tell me what my mistake is?

grok {
      match => { "message" => ["%{INT:Min}:%{BASE10NUM:Sec}-%{INT:Duration},(%{WORD:Event}|%{SPACE:Event}),%{INT:Level}"] }
      match => { "log.file.path" => ["%{INT:TempYYMMDDHH}.log"] }
}

mutate {
      add_field => { "MyTime" => "%{TempYYMMDDHH}%{Min}%{Sec}" }
      remove_field => ["TempYYMMDDHH", "Min", "Sec"]
}
date {
      match => ["MyTime", "yyMMddHHmmss.SSSSSS"]
      target => "@timestamp"
}

I'm trying to combine three fields into one value, but in the output the "MyTime" field comes out looking like this: %{TempYYMMDDHH}5955.073000

Input data:
message => 59:55.073000-0,CONN,0
log.file.path => C:\Users\Username\Documents\logs\rmngr_3496\20021323.log
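
For reference, against that message line the first grok pattern should extract roughly:

Min => 59
Sec => 55.073000
Duration => 0
Event => CONN
Level => 0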

Best regards!

What is the error here?

I don't know :)
In the "MyTime" field I get the value "%{TempYYMMDDHH}5955.073000",
but I need to get "200213235955.073000",
which in @timestamp would be "Feb 13, 2020 @ 23:59:55.073".

Hi

Comment out the remove_field line so you can see what those fields contain and debug your config.
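
Something like:

mutate {
      add_field => { "MyTime" => "%{TempYYMMDDHH}%{Min}%{Sec}" }
      # remove_field => ["TempYYMMDDHH", "Min", "Sec"]
}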

My money is on an empty or undefined %{TempYYMMDDHH}.

Hope this helps.

You're right: TempYYMMDDHH is not in the index. What is the reason for this behavior?

log.file.path is not empty, and the Grok Debugger shows no parsing errors for that line.

Hi

Maybe INT is not the right pattern for your file path. I'd try a more generic one, like DATA, or even GREEDYDATA, just to test it.

Adding a \ in front, as there is one in your file path, might help too:

match => { "log.file.path" => ["\%{GREEDYDATA:TempYYMMDDHH}.log"] }

You might need to escape that "\".
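
That is, something like (untested):

match => { "log.file.path" => ["\\%{GREEDYDATA:TempYYMMDDHH}.log"] }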

Post the output of stdout{}.
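
For example:

output {
      stdout { codec => rubydebug }
}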

Hope this helps.

Unfortunately, that didn't help either.

I did the same thing in Elasticsearch without Logstash and it worked. But when I moved it to Logstash, log.file.path is not written to the field.

Example from Elasticsearch:

PUT _ingest/pipeline/onectj_pipeline 
{
  "description" : "onec tj pipeline",
  "processors": [
    {
      "set": {
        "field": "LogRowsID",
        "value": [
          "{{source}}", "{{_source.offset}}"
        ]
      }
    },
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{INT:_ingest.TempMM}:%{BASE10NUM:_ingest.TempSS}-%{INT:Duration},(%{WORD:Event}|%{SPACE:Event}),%{INT:Level}"
        ]
      }
    },
    {
      "grok": {
        "field": "log.file.path",
        "patterns": [
          "%{INT:_ingest.TempYYMMDDHH}.log"
        ]
      }
    },
    {
      "set": {
        "field": "_ingest.Tempdate",
        "value": "{{_ingest.TempYYMMDDHH}}{{_ingest.TempMM}}{{_ingest.TempSS}}"
      }
    },
    {
      "date": {
        "field": "_ingest.Tempdate",
        "target_field": "@timestamp",
        "formats": [
          "yyMMddHHmmss.SSSSSS"
        ],
        "timezone": "Europe/Moscow"
      }
    },
    {
      "set": {
        "field": "_id",
        "value": "{{_ingest.Tempdate}}-{{_source.offset}}"
      }
    }
  ]
}

Are you sure? I have more often seen [log][file][path] (a [log] object containing a [file] object containing a [path] field, rather than a [log.file.path] field which has periods in the object name).
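
That is, the event is usually shaped something like this (using the path from your example):

{
  "log": {
    "file": {
      "path": "C:\\Users\\Username\\Documents\\logs\\rmngr_3496\\20021323.log"
    }
  }
}

and in a Logstash filter you reference it as [log][file][path], not "log.file.path".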

Hi

As @Badger says, make sure you have a field called log.file.path to begin with. The right way to reference nested fields in Logstash is [a][b][c], not a.b.c.

As a general rule, I'd suggest using underscores instead of dots in field names, so you'll have a_b_c and never be confused again.

Hope this helps

Thank you very much for your help! It solved my problem.

Solution:

match => { "[log][file][path]" => ["%{INT:TempYYMMDDHH}.log"] }
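
For completeness, the full filter from the original post with that one change looks like this (everything else unchanged):

grok {
      match => { "message" => ["%{INT:Min}:%{BASE10NUM:Sec}-%{INT:Duration},(%{WORD:Event}|%{SPACE:Event}),%{INT:Level}"] }
      match => { "[log][file][path]" => ["%{INT:TempYYMMDDHH}.log"] }
}

mutate {
      add_field => { "MyTime" => "%{TempYYMMDDHH}%{Min}%{Sec}" }
      remove_field => ["TempYYMMDDHH", "Min", "Sec"]
}

date {
      match => ["MyTime", "yyMMddHHmmss.SSSSSS"]
      target => "@timestamp"
}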
