Number filtering in Logstash

{"name": "hello", "xy": "ab", "ip": "123.456.78.910", "abc": "AB", "nbc": 12345, "cidr": "123.456.78.9/10", 
 "last_seen_at": "2017-08-15", "first_seen_at": "2015-09-03", "categories": ["hello"]}

I have an array of numbers for the field "nbc" that I need to keep from my log files; everything else needs to be dropped. Since my log files are JSON, I've just been forwarding them as is. Can someone please guide me on how to approach this problem? I could easily grep for these numbers in a shell script, but I want Logstash to handle this for logistical reasons in our pipeline. Any help is appreciated.

Adding my Logstash config.

input {
  file {
    path => "/path/to/file/*.json"
    start_position => "beginning"
    add_field => { "provider" => "xyz" }
    type => "abc"
    codec => "json"
  }
}

filter {
  if [provider] == "xyz" and [type] == "abc" {
    if ["nbc"] !== "1234" OR ["nbc"] !== "4567" OR ["nbc"] !==  "8910"  {
      drop {  }
    }
  }
}

Thanks.

@luciferdude I think the easiest option is the Prune filter which would allow you to add nbc to the whitelist so that only it would end up in Elasticsearch.

filter {
    prune {
        whitelist_names => ["nbc"]
    }
}
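
Applied to your sample event, that would leave roughly just

 {"nbc": 12345}

(plus whatever metadata Logstash adds). Note that prune trims fields out of every event; it doesn't drop whole events.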

Thanks Mike for the response. Actually, I want the whole line that contains these "nbc" values to show up in Elastic. For example, I have a data file with the below 3 lines. I only want the line with the "nbc" value 12345 to show up in Elastic.

Data File:

{"name": "hello", "xy": "ab", "ip": "123.456.78.910", "abc": "AB", "nbc": 12345, "cidr": "123.456.78.9/10", 
 "last_seen_at": "2017-08-15", "first_seen_at": "2015-09-03", "categories": ["hello"]}
{"name": "whatsup", "xy": "ab", "ip": "123.456.78.910", "abc": "AB", "nbc": 4567, "cidr": "123.456.78.9/10", 
 "last_seen_at": "2017-08-15", "first_seen_at": "2015-09-03", "categories": ["whatsup"]}
{"name": "how r u", "xy": "ab", "ip": "123.456.78.910", "abc": "AB", "nbc": 8910, "cidr": "123.456.78.9/10", 
 "last_seen_at": "2017-08-15", "first_seen_at": "2015-09-03", "categories": ["how r u"]}

Expected output in Elastic, with the remaining lines dropped:

 {"name": "hello", "xy": "ab", "ip": "123.456.78.910", "abc": "AB", "nbc": 12345, "cidr": "123.456.78.9/10", 
     "last_seen_at": "2017-08-15", "first_seen_at": "2015-09-03", "categories": ["hello"]}
if ["nbc"] !== "1234" OR ["nbc"] !== "4567" OR ["nbc"] !==  "8910"  {

There shouldn't be any double quotes in the field references, and since your field values are numbers you shouldn't have quotes around the values either. Also, !== isn't a valid operator (use !=), and OR needs to be lowercase. Finally, the conditions have to be joined with and rather than or; since nbc can never equal all three values at once, the or version would be true for every event and drop everything. Try:

 if [nbc] != 1234 and [nbc] != 4567 and [nbc] != 8910 {

Shorter:

 if [nbc] not in [1234, 4567, 8910] {
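
Putting that together with the provider/type guard from your config, the whole filter would look something like this (a sketch using the values from your example):

 filter {
   if [provider] == "xyz" and [type] == "abc" {
     # drop every event whose nbc is not one of the wanted values
     if [nbc] not in [1234, 4567, 8910] {
       drop { }
     }
   }
 }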

Thanks @magnusbaeck. I've taken your advice and here is my latest config. Still no dice. I checked the Logstash log and I see no errors. I restarted Logstash and touched the file to refresh it.

 input {
   file {
     path => "/path/to/file/*.json"
     start_position => "beginning"
     add_field => { "provider" => "xyz" }
     type => "abc"
     codec => "json"
   }
 }

 filter {
   if [provider] == "xyz" and [type] == "abc" {
     if [nbc] not in [12345 6789] {
       drop {  }
     }
   }
 }

 output {
   if [provider] == "xyz" and [type] == "abc" {
     file {
       path => "/path/to/file/newfile"
     }
   }
 }

I've taken your advice and here is my latest config. Still no dice.

You mean you're getting data to the output file, just not what you expected?

if [nbc] not in [12345 6789] {

This isn't what I suggested. You're missing a comma. Did you try the slightly longer version?

Sorry @magnusbaeck, that was a typo on my end; my original config does have the values delimited by a comma. The problem is, Logstash is not creating the output file at all.

that was a typo on my end.

Given the availability of copy/paste I'm always curious how such typos can occur.

The problem is, Logstash is not creating the output file at all.

Then you have an input problem, not a filter problem. Logstash is tailing the file and waiting for more data to be appended to it. Unless you append data, add a new file, or clear the file's sincedb entry, nothing will happen. See the file input's documentation for details.
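
If you just want the file re-read from the top while testing, one common approach (a sketch based on your input block) is to point sincedb_path at /dev/null so Logstash never remembers how far it has read:

 input {
   file {
     path => "/path/to/file/*.json"
     start_position => "beginning"
     # testing only: discard the sincedb so the whole file is
     # re-read from the beginning on every restart
     sincedb_path => "/dev/null"
     add_field => { "provider" => "xyz" }
     type => "abc"
     codec => "json"
   }
 }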

@magnusbaeck I did copy and paste, and then replaced the sensitive material to avoid any security-related concerns, which is when the typo occurred.

Everyone makes mistakes, and IMO it shouldn't be frowned upon, because that discourages people from asking questions in this forum.

Thank you for your help; I've already tried the solutions you provided. I will do some more research and try to resolve it.
