Less than or equal not working in Logstash filter and crashing

Hi Team,

I'm shipping logs from Windows servers using Filebeat. I have created a field, "Create_Indexing_response_time_ms", using grok, which shows the response time in milliseconds. It works perfectly when I run Logstash, and the field is created in ES.
However, when I add a less-than condition with a mutate to create another field, I get the error below (pasted in the next reply).

input {
  beats {
    port => 5044
  }
}
filter {
  if [fields][logtype] == "Seal_Async_Logs" {
    grok {
      match => { "message" => ".*\[TIME]Indexing took: %{NUMBER:Create_Indexing_response_time_ms:int}" }
    }
  }
  if [fields][logtype] == "Seal_Async_Logs" and [Create_Indexing_response_time_ms] < 2000 {
    mutate { add_field => { "Indexing_Time_taken" => "less than 2 seconds" } }
  }
}
output {
  elasticsearch {
    hosts => ["http://10.150.59.17:9200"]
    index => "filebeat.test-%{+YYYY.MM}"
  }
}

[ERROR][logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"", :exception=>Java::JavaLang::NullPointerException, :backtrace=>["org.logstash.config.ir.compiler.EventCondition$Compiler$UnexpectedTypeException.(EventCondition.java:679)", "org.logstash.config.ir.compiler.EventCondition$Compiler.compare(EventCondition.java:453)", "org.logstash.config.ir.compiler.EventCondition$Compiler.lambda$compareFieldToConstant$11(EventCondition.java:444)", "org.logstash.config.ir.compiler.Utils.filterEvents(Utils.java:47)", "org.logstash.generated.CompiledDataset3.compute(Unknown Source)", "org.logstash.generated.CompiledDataset2.compute(Unknown Source)", "org.logstash.generated.CompiledDataset4.compute(Unknown Source)", "org.logstash.config.ir.CompiledPipeline$CompiledU

Please note that I have already converted the type to integer in the grok pattern, and the field shows as "Number" in Elasticsearch.

Kindly help, please.

See the answer here.

Thanks @Badger, but I already have the field created and its type reflects as "Number", so ideally Logstash should not throw the exception when evaluating a greater-than expression.

Am I doing anything wrong? Is it because I converted the type to integer using grok instead of mutate? Kindly help.

As I said, you will get that exception with that stack trace if the field does not exist. If you get a _grokparsefailure, the field will not exist, and Logstash will crash on the comparison.
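One way to avoid the crash (a sketch, reusing the field and tag names from your config) is to only evaluate the numeric comparison when the field actually exists:

```conf
filter {
  # A bare field reference in a conditional is truthy only when the field exists,
  # so the "<" comparison is skipped for events where grok did not extract it.
  if [Create_Indexing_response_time_ms] and [Create_Indexing_response_time_ms] < 2000 {
    mutate { add_field => { "Indexing_Time_taken" => "less than 2 seconds" } }
  }
}
```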

Thanks @Badger. I find it weird that I'm getting the field in Elasticsearch exactly as I defined it in the grok pattern, and I was able to get the expected results using KQL; however, I'm not able to use that field for any further actions in Logstash due to the _grokparsefailure tag.

This is how the field and message appear in each document:

[screenshot: grok1]

Please help.

Any help on this would be great, thanks! :slight_smile:

That message will not match the grok pattern. There is no space between the colon and the number.
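A corrected version of the pattern (a sketch, based on the earlier config and the log line shown) would drop the space after the colon:

```conf
grok {
  # The message reads "Indexing took:63 ms" with no space after the colon,
  # so the pattern must not include one either
  match => { "message" => ".*\[TIME]Indexing took:%{NUMBER:Create_Indexing_response_time_ms:int}" }
}
```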

I changed the pattern to remove the space and tried refreshing the index pattern.

Again, the field appears with the expected value; however, I still get _grokparsefailure in the tags, and because of this I'm unable to use a mutate filter with that same field.

What is the setting for path.config (either on the command line or in logstash.yml)?

Thanks for the quick reply.

I haven't made any changes to logstash.yml, so I guess I'm using the default settings. Please find below.

I have referenced the conf file in pipelines.yml and run Logstash as a service.

Below is how the conf is referenced in pipelines.yml:

# Available options:
#
# name of the pipeline
- pipeline.id: SEALlogs
  path.config: "/ELK/logstash-oss-7.10.2-windows-x86_64/logstash-7.10.2/conf.d/test.conf"

Is there anything that I need to add to path.config in logstash.yml? Please help.

What exactly is the error message that you get with that configuration?

[ERROR][logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"", :exception=>Java::JavaLang::NullPointerException, :backtrace=>["org.logstash.config.ir.compiler.EventCondition$Compiler$UnexpectedTypeException.(EventCondition.java:679)", "org.logstash.config.ir.compiler.EventCondition$Compiler.compare(EventCondition.java:453)", "org.logstash.config.ir.compiler.EventCondition$Compiler.lambda$compareFieldToConstant$11(EventCondition.java:444)", "org.logstash.config.ir.compiler.Utils.filterEvents(Utils.java:47)", "org.logstash.generated.CompiledDataset3.compute(Unknown Source)", "org.logstash.generated.CompiledDataset2.compute(Unknown Source)", "org.logstash.generated.CompiledDataset4.compute(Unknown Source)",

OK, so you have configured a pipeline called SEALlogs, but you are getting the error for a pipeline called main. You are not running the configuration you think you are.

I have corrected that, and I modified the grok pattern to use a lower-case field name, which got rid of the _grokparsefailure.

I also used mutate convert to change the type from string to number. However, when I try the following:

if [indexing] < 2000 {
  mutate { add_field => { "timetaken" => "2secs" } }
}

I'm getting the same pipeline error.

Here is the complete filter I'm using:

filter {
  if [fields][logtype] == "Seal_Async_Logs" {
    grok {
      match => { "message" => ".*\[TIME]Indexing took:%{WORD:indexing}" }
    }
  }
  if [fields][logtype] == "Seal_Async_Logs" {
    mutate {
      convert => { "indexing" => "integer" }
    }
  }
  if [indexing] < 2000 {
    mutate { add_field => { "timetaken" => "2secs" } }
  }
}
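For reference, the extraction and conversion could be collapsed into a single grok (a sketch, reusing the :int suffix from my earlier config):

```conf
filter {
  if [fields][logtype] == "Seal_Async_Logs" {
    grok {
      # NUMBER with the :int suffix extracts and casts in one step,
      # so the separate mutate/convert filter is not needed
      match => { "message" => ".*\[TIME]Indexing took:%{NUMBER:indexing:int}" }
    }
  }
}
```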

And below is the actual log line:

2021-09-16 09:59:27,799 INFO [ro.star.seal.session.archive.impl.DocumentServiceBean] (default task-22) [TIME]Indexing took:63 ms

The issue is not with grok anymore. The field is created in ES as expected, without _grokparsefailure.

The problem now is that I am not able to add a field when I use the < or > operator; however, it works when I use ==.

Can someone please help me figure out whether anything else needs to be done before using these expressions?