How to use multiple if statements in the filter section of a Logstash configuration file

Hi All,

I am a newbie to the ELK stack.

I am currently using Logstash version 7.2.

My issue is that I need to update the values of certain keyword fields.

Below is my Logstash configuration file:

    input {
      http {
        host => "0.0.0.0"
        port => xxxx
        threads => 100
      }
    }
    filter {
      if [headers][request_method] == "GET" {
        drop {}
      } else {
        json {
          source => "message"
          remove_field => [ "headers", "message", "host" ]
        }
        date {
          match => [ "timeStamp", "ISO8601" ]
          target => "timeStamp"
        }
        fingerprint {
          source => [ "origin", "originLogId" ]
          target => "[@metadata][fingerprint]"
          method => "SHA256"
          key => "@AUDITTRAIL-NOSQL@"
          concatenate_sources => true
        }
        if [service] in ["F_N_C_R", "F_N_C_R_W_N"] {
          mutate {
            update => { "service" => "F N C" }
          }
          if [result] in ["HIT"] {
            mutate {
              update => { "result" => "Number of biometric HIT" }
            }
          }
          if [result] in ["NO_HIT"] {
            mutate {
              update => { "result" => "Number of biometric NO HIT" }
            }
          }
          if [result] in ["REQUEST_ACCEPTED"] {
            mutate {
              update => { "result" => "Identity Accepted" }
            }
          }
          if [result] in ["REQUEST_REJECTED"] {
            mutate {
              update => { "result" => "Identity Rejected" }
            }
          }
          if [service] in ["IDENTITY_MERGE"] {
            mutate {
              update => { "service" => "Identity merged" }
            }
          }
          if [AT_VAL1] in ["APPROVED"] {
            mutate {
              update => { "AT_VAL1" => "Number of approved requests" }
            }
          }
          if [AT_VAL1] in ["NOT_APPROVED"] {
            mutate {
              update => { "AT_VAL1" => "Number of rejected requests" }
            }
          }
          if [AT_VAL1] in ["PERSONALISE"] {
            mutate {
              update => { "AT_VAL1" => "Personalized cards" }
            }
          }
          if [AT_VAL1] in ["DISPATCH"] {
            mutate {
              update => { "AT_VAL1" => "Dispatched cards" }
            }
          }
          if [AT_VAL1] in ["SECURED"] {
            mutate {
              update => { "AT_VAL1" => "Issued cards" }
            }
          }
          if [AT_VAL1] in ["REVOCATION"] {
            mutate {
              update => { "AT_VAL1" => "Terminated cards" }
            }
          }
        }
      }
    }
    output {
      elasticsearch {
        hosts => [ "xx.xx.xx.xx:xxxx" ]
        index => "test-%{+YYYY.MM.dd}"
        document_id => "%{[@metadata][fingerprint]}"
      }
    }

The issue is that the first if statement works perfectly for my Elasticsearch data, but my subsequent if statements aren't working. Logstash is able to read the configuration with no errors, but in Kibana my data isn't coming through as I expect.

Please help me fix this issue.

So you are saying that the test of [headers][request_method] works, but none of the tests against the parsed JSON work? If that is the case, what does an event look like? Either add

output { stdout { codec => rubydebug } }

or expand an event in the Discover pane in kibana and copy the text from the JSON tab.

if [AT_VAL1] in ["SECURED"]

That is an odd way to do a string comparison. "in" here is an array membership test, for an array with a single member. It looks to me equivalent to

if [AT_VAL1] == "SECURED"

Note that it is certainly different to

if [AT_VAL1] in "SECURED"

which is a substring match, which would test true if [AT_VAL1] were equal to "S" or "URE".
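As a concrete illustration, here is a minimal filter sketch of the three comparison forms side by side (the tag names are made up for illustration; the field name is taken from the posted config):

```
filter {
  # Array membership test: checks [AT_VAL1] against the array's elements
  if [AT_VAL1] in ["SECURED"] {
    mutate { add_tag => [ "in_array" ] }
  }
  # Plain string equality
  if [AT_VAL1] == "SECURED" {
    mutate { add_tag => [ "equals" ] }
  }
  # Substring match: true when [AT_VAL1] occurs anywhere inside "SECURED"
  if [AT_VAL1] in "SECURED" {
    mutate { add_tag => [ "substring" ] }
  }
}
```

With [AT_VAL1] equal to "URE", only the substring form would tag the event; with [AT_VAL1] equal to "SECURED", the equality form would.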

You can use else if, which works perfectly for me. I have multiple filters for each input in a single pipeline.
Example:

        input {
            beats {
                type => "input1"
                ...
            }
            beats {
                type => "input2"
                ...
            }
        }
        filter {
             if [type] == "input1" {
                 ...
             }
             else if [type] == "input2" {
                 ...
             }
              ....
        }

Hi All,

Thanks a lot for your replies and the solution. Based on your suggestions I have changed my code in the filter section, but there are still no changes in my Kibana dashboard.

The strangest part is that my first condition always works, but the subsequent filters aren't applied at all to my ingested data.

And there is neither a syntax nor a method error in the filter part, as my Logstash service is perfectly able to read the configuration file and launch.

PFB the updated code with the changes suggested above; please point out where I am making a mistake.

    input {
      http {
        host => "0.0.0.0"
        port => 9090
        threads => 100
      }
    }
    filter {
      if [headers][request_method] == "GET" {
        drop {}
      } else {
        json {
          source => "message"
          remove_field => [ "headers", "message", "host" ]
        }
        date {
          match => [ "timeStamp", "ISO8601" ]
          target => "timeStamp"
        }
        fingerprint {
          source => [ "origin", "originLogId" ]
          target => "[@metadata][fingerprint]"
          method => "SHA256"
          key => "@AUDITTRAIL-NOSQL@"
          concatenate_sources => true
        }
        if [service] == ["F_N_C_R_W_N", "F_N_C_R_N"] {
          mutate {
            update => { "service" => "F N C" }
          }
        } else if [result] == ["HIT"] {
          mutate {
            update => { "result" => "Number of HIT" }
          }
        } else if [result] == ["NO_HIT"] {
          mutate {
            update => { "result" => "Number of NO HIT" }
          }
        } else if [result] == ["REQUEST_ACCEPTED"] {
          mutate {
            update => { "result" => "Identity Accepted" }
          }
        } else if [result] == ["REQUEST_REJECTED"] {
          mutate {
            update => { "result" => "Identity Rejected" }
          }
        } else if [service] == ["IDENTITY_MERGE"] {
          mutate {
            update => { "service" => "Identity merged" }
          }
        } else if [AT_VAL1] == ["APPROVED"] {
          mutate {
            update => { "AT_VAL1" => "Number of approved requests" }
          }
        } else if [AT_VAL1] == ["NOT_APPROVED"] {
          mutate {
            update => { "AT_VAL1" => "Number of rejected requests" }
          }
        } else if [AT_VAL1] == ["PERSONALISE"] {
          mutate {
            update => { "AT_VAL1" => "Personalized" }
          }
        } else if [AT_VAL1] == ["DISPATCH"] {
          mutate {
            update => { "AT_VAL1" => "Dispatched" }
          }
        } else if [AT_VAL1] == ["SECURED"] {
          mutate {
            update => { "AT_VAL1" => "Issued" }
          }
        } else if [AT_VAL1] == ["REVOCATION"] {
          mutate {
            update => { "AT_VAL1" => "Terminated" }
          }
        }
      }
    }

If all of your logs have GET in [headers][request_method], then they will all get dropped.

Can you provide some sample data that should have been processed by the other conditions? You could set the output to stdout as Badger suggested.
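For reference, here is a minimal sketch combining both points (the condition is taken from the posted config; the stdout output is for debugging only):

```
filter {
  # drop{} cancels the event: anything matching this condition is
  # discarded and never reaches any output
  if [headers][request_method] == "GET" {
    drop {}
  }
}
output {
  # Print every surviving event to the console for inspection
  stdout { codec => rubydebug }
}
```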

Hi Ptamba,

here is a sample output, where I want to mutate the value of the result keyword:

        {
          "_index": "example-test",
          "_type": "_doc",
          "_id": "f0c097290a09ab38d9bd7541c02a728c98a5a2df41e77a32765ec26ca2065282",
          "_version": 1,
          "_score": 1,
          "_source": {
            "internalId": "1804",
            "@timestamp": "2020-05-30T23:15:09.399Z",
            "AT_CUSTO_S5": "",
            "AT_VAL2": "Male",
            "AT_CUSTO_S9": "",
            "ER_STATE": "value_40",
            "origin": "linux",
            "AT_VAL10": "",
            "AT_VAL5": "TECHNICAL",
            "AT_CUSTO_S8": "",
            "result": "REQUEST_REJECTED",
            "host": "xx.xx.xx.xx",
            "AT_VAL8": "NORMAL",
            "headers": {
              "request_path": "/",
              "request_method": "PUT",
              "content_length": "1041",
              "accept_encoding": "gzip,deflate",
              "connection": "Keep-Alive",
              "http_version": "HTTP/1.1",
              "http_host": "xx.xx.xx.xx:9090",
              "http_accept": null,
              "content_type": "application/json;charset=UTF-8",
              "http_user_agent": "Apache-HttpClient/4.5.7 (Java/1.8.0_102)"
            },
            "AR_STATUS": "value_30",
            "AT_VAL1": "SERVICE_TASK_RESULT",
            "businessId": "1234520100119904",
            "LOG_QUALIFIER": "USER_TASK_RESULT",
            "AT_VAL4": "INVESTIGATION_PROCESSED",
            "logCategory": "BUSINESS",
            "AT_EXTENDEDINFO": "",
            "AT_OUTSTANDINGVAL": "",
            "owner": "anonymous",
            "AT_CUSTO_S3": "309",
            "AT_VAL3": "37",
            "@version": "1",
            "ER_ID": "value_70",
            "REQUEST_TYPE": "value_60",
            "AT_VAL7": "STARTED",
            "USER_TYPE": "value_90",
            "timeStamp": "2015-01-01T01:02:03.904Z",
            "AT_CUSTO_S6": "",
            "AT_CUSTO_S4": "",
            "service": "F_N_C_R_W_N",
            "station": "origin",
            "AT_VAL6": "Investigation",
            "PRIORITY_LEVEL": "value_80",
            "duration": 150,
            "originLogId": "6dd11067-a2cb-11ea-b158-d94038694644",
            "AT_CUSTO_S2": "0485837819",
            "AT_CUSTO_S10": "",
            "AT_CUSTO_S7": "",
            "ENTITLEMENT_TYPE": "value_100",
            "AT_CUSTO_S1": "",
            "AT_VAL9": "49",
            "ER_STATUS": "value_50",
            "activity": "CheckInvestigationStatus"
          },
          "fields": {
            "timeStamp": [
              "2015-01-01T01:02:03.904Z"
            ],
            "@timestamp": [
              "2020-05-30T23:15:09.399Z"
            ]
          }
        }

And also, can you please explain to me with an example what the drop will do, as you mentioned in your previous reply?

So based on your output, you're expecting the value of result to be updated to "Identity Rejected"? If so, try changing it to

    else if [result] == "REQUEST_REJECTED" {
      mutate {
        update => { "result" => "Identity Rejected" }
      }
    }
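Putting the fix together, here is a sketch of the corrected comparisons (field names and replacement strings are taken from the posted config; these are plain string equality tests, not array comparisons). Since the tests involve three different fields, one long else-if chain would apply at most one update per event; a separate chain per field is an option if an event can match on more than one field:

```
filter {
  if [service] == "F_N_C_R_W_N" or [service] == "F_N_C_R_N" {
    mutate { update => { "service" => "F N C" } }
  } else if [service] == "IDENTITY_MERGE" {
    mutate { update => { "service" => "Identity merged" } }
  }

  if [result] == "REQUEST_ACCEPTED" {
    mutate { update => { "result" => "Identity Accepted" } }
  } else if [result] == "REQUEST_REJECTED" {
    mutate { update => { "result" => "Identity Rejected" } }
  }

  if [AT_VAL1] == "APPROVED" {
    mutate { update => { "AT_VAL1" => "Number of approved requests" } }
  } else if [AT_VAL1] == "NOT_APPROVED" {
    mutate { update => { "AT_VAL1" => "Number of rejected requests" } }
  }
}
```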

Thanks a lot @ptamba, your trick worked perfectly and the values are coming into the dashboard as I wanted.

But I still have a small issue: as you can see in my code above, I am filtering and mutating multiple fields.

According to the client, the values have to appear in a certain order, but in Kibana, when I filter and place the values in a data table, it sorts the values alphabetically.

Is there any way to sort the values as I want, rather than alphabetically, in the Kibana data table visualization?

You might get a better answer for Kibana-related topics in the Kibana forum. However, to my limited knowledge of Kibana, you should be able to sort fields in your visualizations as long as they are sortable types (such as keyword, date, or numeric).

Hi @ptamba @vikasp @Badger

I have one requirement for which I need help from you all.

In my Elasticsearch I have a keyword field, owner.keyword, which is to be used as a dynamic filter.

I have set this up using the Controls visualization, and it now looks like this:

[screenshot of the Controls visualization]

But my client wants a new value in the owner keyword named "All", which, when selected, will select all the users at once.

How do I create and add that "All" value to owner.keyword?

Please help me on this.

Just to add more information about Logstash conditionals: the "compare the content of a field with an array" form will actually not test true if there is only one element in the array.

    if [AT_VAL1] in ["SECURED"]            # will never test true, because the array has only 1 element (a Logstash bug)
    if [AT_VAL1] in ["SECURED", "SECURED"] # is equivalent to: if [AT_VAL1] == "SECURED"

I've just discovered the substring match comparison in your last paragraph and it's :exploding_head:, although I don't have a direct use for it.

@Matish_Bhuyan for a new question it would be better to ask in the Kibana forum.

Anyway, the standard behavior of a control is to list only the existing values, so I presume it won't be easy to change that. It may be easier to convince the client simply not to filter by that field :smiley:

I did exactly that, and thankfully they are happy to accept this.

Thank you everyone for your timely responses

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.