How to create a new field by extracting data from an already generated field?

This is what my source field looks like:

(screenshot of the source field, showing a path like /var/log/1min/20450218.0954.min)

The filename is in the format YYYYMMDD.HHMM.min,
so the file above was created in year 2045, month 02, on day 18, at time 09:54.

What I want to do is extract the date and time from the above source field and create a new field which contains only the date and time.
Can anybody please tell me how I can do it?
Please help!

You can use a grok filter with source as the input field and then just create a grok pattern to match.
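For example (a minimal sketch; source is the file-path field that Filebeat adds, and the capture names year/month/day/hour/minute are just examples):

grok {
  # parse the filename in the source field rather than the log line itself
  match => { "source" => "%{YEAR:year}%{MONTHNUM:month}%{MONTHDAY:day}\.%{HOUR:hour}%{MINUTE:minute}\.min" }
}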

That I know, but this is my logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => [
        "(#!TOT:)?320:%{NUMBER:q}:%{NUMBER:q1}:%{NUMBER:q2}:%{NUMBER:q3}:%{NUMBER:q4}:%{NUMBER:q5}:%{NUMBER:st2} %{NUMBER:st3} %{NUMBER:st4} %{NUMBER:st5} %{NUMBER:st6} %{NUMBER:st7} %{NUMBER:st8} %{NUMBER:st9} %{NUMBER:st10} %{NUMBER:st11} %{NUMBER:st12} %{NUMBER:st13} %{NUMBER:st14} %{NUMBER:st15} %{NUMBER:st16} %{NUMBER:st17} %{NUMBER:st18} %{NUMBER:st19} %{NUMBER:st20} %{NUMBER:st21} %{NUMBER:st22} %{NUMBER:st23} %{NUMBER:st24} %{NUMBER:st25} %{NUMBER:st26} %{NUMBER:st27} %{NUMBER:st28} %{NUMBER:st29} :",
        "(#!TOT:)?83:%{NUMBER:q}:%{NUMBER:q1}:%{NUMBER:q2}:%{NUMBER:q3}:%{NUMBER:q4}:%{NUMBER:q5}:%{NUMBER:st2} %{NUMBER:st3} %{NUMBER:st4} %{NUMBER:st5} %{NUMBER:st6} %{NUMBER:st7} %{NUMBER:st8} %{NUMBER:st9} %{NUMBER:st10} %{NUMBER:st11} :"
      ] }
    }
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
  else {
    mutate {
      remove_field => ["[q]","[q1]","[q2]","[q3]","[q4]","[st1]","[st3]","[st5]","[st6]","[st7]","[st8]","[st9]","[st10]","[st11]","[st12]","[st14]","[st15]","[st17]","[st18]","[st19]","[st20]","[st21]","[st22]","[st23]","[st24]","[st25]","[st26]","[st28]","[st29]"]
      convert => {
        "st4" => "integer"
        "st2" => "integer"
        "st16" => "integer"
        "st13" => "integer"
        "st27" => "integer"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

I already have two match patterns inside the grok filter; if a log matches one of them, the other is skipped. Where in my config should I put the filter that extracts the keywords from the source field? Can you suggest an edit or solution?

Just add another grok filter:

grok {
  match => { "source" => "%{UNIXPATH}/%{YEAR}%{MONTHNUM}%{MONTHDAY}.%{HOUR}%{MINUTE}.min" }
}

This should give you output that looks like this:

{
  "UNIXPATH": [
    [
      "/var/log/1min"
    ]
  ],
  "YEAR": [
    [
      "2045"
    ]
  ],
  "MONTHNUM": [
    [
      "02"
    ]
  ],
  "MONTHDAY": [
    [
      "18"
    ]
  ],
  "HOUR": [
    [
      "09"
    ]
  ],
  "MINUTE": [
    [
      "54"
    ]
  ]
}
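
If you want a single new field containing only the date and time, give the captures names (e.g. %{YEAR:year} instead of %{YEAR}; unnamed captures are not stored as event fields by Logstash by default) and then combine them with mutate. A rough sketch; the field name file_datetime is just an example:

mutate {
  # build one field like "2045-02-18 09:54" from the named grok captures
  add_field => { "file_datetime" => "%{year}-%{month}-%{day} %{hour}:%{minute}" }
}

Both the extra grok and this mutate go inside the existing filter { } block of your logstash.conf, alongside the filters you already have; the temporary fields can be dropped afterwards with remove_field if you don't need them.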

Thanks a lot!
