Logstash filter strange behavior

Hi guys.
I noticed that the convert_datatype option of the Dissect filter produces a strange result.

Pipeline:

input {
    stdin { } 
}

filter {
    # split IPv4 address into octets
    dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }   
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }   
    }   
    
    # drop message if IP is private
    if ([octet_1] == 0
        or [octet_1] == 10
        or [octet_1] == 127 
        or ([octet_1] == 100 
            and ([octet_2] >= 64 and [octet_2] <= 127))
        or ([octet_1] == 172 
            and ([octet_2] >= 16 and [octet_2] <= 31))
        or ([octet_1] == 192 and [octet_2] == 168)
        or ([octet_1] == 169 and [octet_2] == 254)) {

        drop { } 
    }   
}

output {
    stdout { codec => rubydebug }
}

Result:

43.107.5.32
{
       "octet_4" => 32,
       "message" => "43.107.5.32",
      "@version" => "1",
    "@timestamp" => 2020-08-07T20:41:04.389Z,
          "host" => "elk",
       "octet_2" => 107,
       "octet_1" => 43,
       "octet_3" => 5
}
192.168.1.1
{
       "octet_4" => 1,
       "message" => "192.168.1.1",
      "@version" => "1",
    "@timestamp" => 2020-08-07T20:41:16.888Z,
          "host" => "elk",
       "octet_2" => 168,
       "octet_1" => 192,
       "octet_3" => 1
}

If I use the convert option of the Mutate filter instead, everything works as expected:
Pipeline:

input {
    stdin { } 
}

filter {
    #split IPv4 address into octets
    dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
    }

    mutate {
        convert => {
            "octet_1" => "integer"
            "octet_2" => "integer"
            "octet_3" => "integer"
            "octet_4" => "integer"
        }
    }

    #drop message if IP is private
    if ([octet_1] == 0
        or [octet_1] == 10
        or [octet_1] == 127 
        or ([octet_1] == 100  
            and ([octet_2] >= 64 and [octet_2] <= 127))
        or ([octet_1] == 172  
            and ([octet_2] >= 16 and [octet_2] <= 31))
        or ([octet_1] == 192 and [octet_2] == 168)
        or ([octet_1] == 169 and [octet_2] == 254)) {

        drop { }
    }
}

output {
    stdout { codec => rubydebug }
}

Result:

43.107.5.32
{
       "octet_2" => 107,
       "message" => "43.107.5.32",
       "octet_3" => 5,
       "octet_4" => 32,
          "host" => "elk",
    "@timestamp" => 2020-08-07T20:48:05.082Z,
      "@version" => "1",
       "octet_1" => 43
}
192.168.1.1
0.0.0.0

Is this behavior expected by design, or is it a bug?

I cannot reproduce the problem. When I use convert_datatype, the 192.168.1.1 message gets dropped. Is it possible you have a typo in your if statement in that version of the configuration?

By the way, I would do this using

    cidr {
        address => "%{message}"
        network => [ "127.0.0.0/8", "10.0.0.0/8", "0.0.0.0/8", "192.168.0.0/16" ]
        add_tag => [ "sharedIP" ]
    }
    if "sharedIP" in [tags] { drop {} }

I will leave it to you to fill in the rest of the CIDR network names that need to be dropped.
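
For reference, a list covering the same ranges as the octet checks in your conditional would be something like this (just a sketch; double-check it against the ranges you actually want to drop):

    cidr {
        address => [ "%{message}" ]
        network => [ "0.0.0.0/8", "10.0.0.0/8", "100.64.0.0/10", "127.0.0.0/8",
                     "169.254.0.0/16", "172.16.0.0/12", "192.168.0.0/16" ]
        add_tag => [ "sharedIP" ]
    }
    if "sharedIP" in [tags] { drop {} }

The network option takes CIDR notation, so each octet-range test maps onto a single prefix (for example, octet_1 == 172 with octet_2 between 16 and 31 is 172.16.0.0/12).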

That is interesting. There are definitely no typos. The next two messages contain the debug-mode output.
Could it be due to the Logstash version? I have 7.1.1 installed.
Also, thank you for the cidr filter; I didn't know about it.

[max@elk logstash]$ sudo bin/logstash -f conf.d/test_convert_datatype.conf --log.level=debug --config.debug
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[DEBUG] 2020-08-08 14:58:18.782 [main] scaffold - Found module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[DEBUG] 2020-08-08 14:58:18.788 [main] registry - Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x1ffd6df2 @directory="/usr/share/logstash/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[DEBUG] 2020-08-08 14:58:18.789 [main] scaffold - Found module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[DEBUG] 2020-08-08 14:58:18.789 [main] registry - Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x63ec3626 @directory="/usr/share/logstash/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[DEBUG] 2020-08-08 14:58:19.366 [LogStash::Runner] runner - -------- Logstash Settings (* means modified) ---------
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - node.name: "elk"
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - *path.config: "conf.d/test_convert_datatype.conf"
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - path.data: "/usr/share/logstash/data"
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - modules.cli: []
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - modules: []
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - modules_list: []
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - modules_variable_list: []
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - modules_setup: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - config.test_and_exit: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - config.reload.automatic: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - config.reload.interval: 3000000000
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - config.support_escapes: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - config.field_reference.parser: "STRICT"
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - metric.collect: true
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.id: "main"
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.system: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.workers: 2
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.batch.size: 125
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.batch.delay: 50
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.unsafe_shutdown: false
[DEBUG] 2020-08-08 14:58:19.367 [LogStash::Runner] runner - pipeline.java_execution: true
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - pipeline.reloadable: true
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - path.plugins: []
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - *config.debug: true (default: false)
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - *log.level: "debug" (default: "info")
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - version: false
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - help: false
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - log.format: "plain"
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - http.host: "127.0.0.1"
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - http.port: 9600..9700
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - http.environment: "production"
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.type: "memory"
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.drain: false
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.page_capacity: 67108864
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.max_bytes: 1073741824
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.max_events: 0
[DEBUG] 2020-08-08 14:58:19.368 [LogStash::Runner] runner - queue.checkpoint.acks: 1024
[DEBUG] 2020-08-08 14:58:19.372 [LogStash::Runner] runner - queue.checkpoint.writes: 1024
[DEBUG] 2020-08-08 14:58:19.372 [LogStash::Runner] runner - queue.checkpoint.interval: 1000
[DEBUG] 2020-08-08 14:58:19.372 [LogStash::Runner] runner - queue.checkpoint.retry: false
[DEBUG] 2020-08-08 14:58:19.372 [LogStash::Runner] runner - dead_letter_queue.enable: false
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - dead_letter_queue.max_bytes: 1073741824
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - slowlog.threshold.warn: -1
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - slowlog.threshold.info: -1
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - slowlog.threshold.debug: -1
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - slowlog.threshold.trace: -1
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - keystore.file: "/usr/share/logstash/config/logstash.keystore"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - path.queue: "/usr/share/logstash/data/queue"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - path.dead_letter_queue: "/usr/share/logstash/data/dead_letter_queue"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - path.settings: "/usr/share/logstash/config"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - path.logs: "/usr/share/logstash/logs"
[DEBUG] 2020-08-08 14:58:19.373 [LogStash::Runner] runner - xpack.management.enabled: false
...
[DEBUG] 2020-08-08 14:58:19.374 [LogStash::Runner] runner - xpack.monitoring.collection.config.enabled: true
[DEBUG] 2020-08-08 14:58:19.374 [LogStash::Runner] runner - node.uuid: ""
[DEBUG] 2020-08-08 14:58:19.374 [LogStash::Runner] runner - --------------- Logstash Settings -------------------
[WARN ] 2020-08-08 14:58:19.448 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2020-08-08 14:58:19.459 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.1.1"}
[DEBUG] 2020-08-08 14:58:19.514 [LogStash::Runner] agent - Setting up metric collection
[DEBUG] 2020-08-08 14:58:19.605 [LogStash::Runner] os - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2020-08-08 14:58:19.912 [LogStash::Runner] jvm - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2020-08-08 14:58:20.044 [LogStash::Runner] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2020-08-08 14:58:20.053 [LogStash::Runner] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2020-08-08 14:58:20.086 [LogStash::Runner] persistentqueue - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2020-08-08 14:58:20.121 [LogStash::Runner] deadletterqueue - Starting {:polling_interval=>5, :polling_timeout=>120}
[DEBUG] 2020-08-08 14:58:20.212 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Starting agent
[DEBUG] 2020-08-08 14:58:20.318 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] configpathloader - Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/usr/share/logstash/conf.d/metric_value.conf", "/usr/share/logstash/conf.d/test.conf", "/usr/share/logstash/conf.d/test_convert.conf"]}
[DEBUG] 2020-08-08 14:58:20.320 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] configpathloader - Reading config file {:config_file=>"/usr/share/logstash/conf.d/test_convert_datatype.conf"}
[DEBUG] 2020-08-08 14:58:20.359 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] pipelineconfig - -------- Logstash Config ---------
[DEBUG] 2020-08-08 14:58:20.360 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] pipelineconfig - Config from source {:source=>LogStash::Config::Source::Local, :pipeline_id=>:main}
[DEBUG] 2020-08-08 14:58:20.362 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] pipelineconfig - Config string {:protocol=>"file", :id=>"/usr/share/logstash/conf.d/test_convert_datatype.conf"}
[DEBUG] 2020-08-08 14:58:20.365 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] pipelineconfig - 

input {
    stdin { } 
}


filter {
    #split IPv4 address into octets
    dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }

    #drop message if IP is private
    if ([octet_1] == 0
        or [octet_1] == 10
        or [octet_1] == 127 
        or ([octet_1] == 100 
            and ([octet_2] >= 64 and [octet_2] <= 127))
        or ([octet_1] == 172 
            and ([octet_2] >= 16 and [octet_2] <= 31))
        or ([octet_1] == 192 and [octet_2] == 168)
        or ([octet_1] == 169 and [octet_2] == 254)) {

        drop { } 
    }   
}


output {
    stdout { codec => rubydebug }
}


[DEBUG] 2020-08-08 14:58:20.367 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] pipelineconfig - Merged config
...
[DEBUG] 2020-08-08 14:58:20.409 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Converging pipelines state {:actions_count=>1}
[DEBUG] 2020-08-08 14:58:20.424 [Converge PipelineAction::Create<main>] agent - Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[DEBUG] 2020-08-08 14:58:24.877 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"stdin", :type=>"input", :class=>LogStash::Inputs::Stdin}
[DEBUG] 2020-08-08 14:58:25.216 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"line", :type=>"codec", :class=>LogStash::Codecs::Line}
[DEBUG] 2020-08-08 14:58:25.268 [pool-3-thread-2] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2020-08-08 14:58:25.269 [pool-3-thread-2] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2020-08-08 14:58:25.304 [Converge PipelineAction::Create<main>] line - config LogStash::Codecs::Line/@id = "line_3be31be5-19b5-4cf2-8e0c-88d94560b42d"
[DEBUG] 2020-08-08 14:58:25.304 [Converge PipelineAction::Create<main>] line - config LogStash::Codecs::Line/@enable_metric = true
[DEBUG] 2020-08-08 14:58:25.305 [Converge PipelineAction::Create<main>] line - config LogStash::Codecs::Line/@charset = "UTF-8"
[DEBUG] 2020-08-08 14:58:25.305 [Converge PipelineAction::Create<main>] line - config LogStash::Codecs::Line/@delimiter = "\n"
[DEBUG] 2020-08-08 14:58:25.320 [Converge PipelineAction::Create<main>] stdin - config LogStash::Inputs::Stdin/@id = "cb38009635b663af637630806e87d4908778ecf8201c8db6165520f798dbc775"
[DEBUG] 2020-08-08 14:58:25.321 [Converge PipelineAction::Create<main>] stdin - config LogStash::Inputs::Stdin/@enable_metric = true
[DEBUG] 2020-08-08 14:58:25.330 [Converge PipelineAction::Create<main>] stdin - config LogStash::Inputs::Stdin/@codec = <LogStash::Codecs::Line id=>"line_3be31be5-19b5-4cf2-8e0c-88d94560b42d", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[DEBUG] 2020-08-08 14:58:25.332 [Converge PipelineAction::Create<main>] stdin - config LogStash::Inputs::Stdin/@add_field = {}
[DEBUG] 2020-08-08 14:58:25.377 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"dissect", :type=>"filter", :class=>LogStash::Filters::Dissect}
[DEBUG] 2020-08-08 14:58:25.395 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@convert_datatype = {"octet_1"=>"int", "octet_2"=>"int", "octet_3"=>"int", "octet_4"=>"int"}
[DEBUG] 2020-08-08 14:58:25.395 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@mapping = {"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}
[DEBUG] 2020-08-08 14:58:25.395 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@id = "74f2767c695b2552e827a404b71a6eeba252504188d802356c888732a73c50ff"
[DEBUG] 2020-08-08 14:58:25.395 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@enable_metric = true
[DEBUG] 2020-08-08 14:58:25.395 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@add_tag = []
[DEBUG] 2020-08-08 14:58:25.396 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@remove_tag = []
[DEBUG] 2020-08-08 14:58:25.396 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@add_field = {}
[DEBUG] 2020-08-08 14:58:25.396 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@remove_field = []
[DEBUG] 2020-08-08 14:58:25.396 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@periodic_flush = false
[DEBUG] 2020-08-08 14:58:25.396 [Converge PipelineAction::Create<main>] dissect - config LogStash::Filters::Dissect/@tag_on_failure = ["_dissectfailure"]
[DEBUG] 2020-08-08 14:58:25.408 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"drop", :type=>"filter", :class=>LogStash::Filters::Drop}
[DEBUG] 2020-08-08 14:58:25.421 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@id = "d588837e65190f1decf82d06e322a66f5370b82b710beaef8e9b9397fc0b8bf4"
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@enable_metric = true
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@add_tag = []
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@remove_tag = []
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@add_field = {}
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@remove_field = []
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@periodic_flush = false
[DEBUG] 2020-08-08 14:58:25.422 [Converge PipelineAction::Create<main>] drop - config LogStash::Filters::Drop/@percentage = 100
[DEBUG] 2020-08-08 14:58:25.426 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[DEBUG] 2020-08-08 14:58:25.445 [Converge PipelineAction::Create<main>] registry - On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[DEBUG] 2020-08-08 14:58:25.456 [Converge PipelineAction::Create<main>] rubydebug - config LogStash::Codecs::RubyDebug/@id = "rubydebug_8c004552-a209-4773-9a25-46fd7e488a63"
[DEBUG] 2020-08-08 14:58:25.456 [Converge PipelineAction::Create<main>] rubydebug - config LogStash::Codecs::RubyDebug/@enable_metric = true
[DEBUG] 2020-08-08 14:58:25.456 [Converge PipelineAction::Create<main>] rubydebug - config LogStash::Codecs::RubyDebug/@metadata = false
[DEBUG] 2020-08-08 14:58:26.025 [Converge PipelineAction::Create<main>] stdout - config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_8c004552-a209-4773-9a25-46fd7e488a63", enable_metric=>true, metadata=>false>
[DEBUG] 2020-08-08 14:58:26.025 [Converge PipelineAction::Create<main>] stdout - config LogStash::Outputs::Stdout/@id = "6c837b12513d0942fb7276e693c92ddea634ef0c4de534933a125695ead4cf32"
[DEBUG] 2020-08-08 14:58:26.026 [Converge PipelineAction::Create<main>] stdout - config LogStash::Outputs::Stdout/@enable_metric = true
[DEBUG] 2020-08-08 14:58:26.026 [Converge PipelineAction::Create<main>] stdout - config LogStash::Outputs::Stdout/@workers = 1
[DEBUG] 2020-08-08 14:58:26.038 [Converge PipelineAction::Create<main>] JavaBasePipelineExt - Compiled pipeline code for pipeline main : **GRAPH**
Vertices: 6 Edges: 6
----------------------P[input-stdin{}|[str]pipeline:2:5:```
stdin { }
```] -> __QUEUE__
__QUEUE__ -> P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}, "convert_datatype"=>{"octet_1"=>"int", "octet_2"=>"int", "octet_3"=>"int", "octet_4"=>"int"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }
```]
P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}, "convert_datatype"=>{"octet_1"=>"int", "octet_2"=>"int", "octet_3"=>"int", "octet_4"=>"int"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }
```] -> [if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))]
[if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] -|true|-> P[filter-drop{}|[str]pipeline:31:9:```
drop { }
```]
[if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] -|false|-> P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:37:5:```
stdout { codec => rubydebug }
```]
P[filter-drop{}|[str]pipeline:31:9:```
drop { }
```] -> P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:37:5:```
stdout { codec => rubydebug }
```]
**GRAPH**
[DEBUG] 2020-08-08 14:58:26.083 [Converge PipelineAction::Create<main>] javapipeline - Starting pipeline {:pipeline_id=>"main"}
[INFO ] 2020-08-08 14:58:26.174 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x41055b7b run>"}
[INFO ] 2020-08-08 14:58:26.360 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[DEBUG] 2020-08-08 14:58:26.383 [Converge PipelineAction::Create<main>] javapipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b run>"}
The stdin plugin is now waiting for input:
[DEBUG] 2020-08-08 14:58:26.563 [logstash-pipeline-flush] PeriodicFlush - Pushing flush onto pipeline.
[INFO ] 2020-08-08 14:58:26.631 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[DEBUG] 2020-08-08 14:58:26.673 [Api Webserver] agent - Starting puma
[DEBUG] 2020-08-08 14:58:26.695 [Api Webserver] agent - Trying to start WebServer {:port=>9600}
[DEBUG] 2020-08-08 14:58:26.840 [Api Webserver] service - [api-service] start
[DEBUG] 2020-08-08 14:58:26.278 [[main]>worker1] CompiledPipeline - Compiled filter
 P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}, "convert_datatype"=>{"octet_1"=>"int", "octet_2"=>"int", "octet_3"=>"int", "octet_4"=>"int"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@af2cdd9f
[DEBUG] 2020-08-08 14:58:26.271 [[main]>worker0] CompiledPipeline - Compiled filter
 P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}, "convert_datatype"=>{"octet_1"=>"int", "octet_2"=>"int", "octet_3"=>"int", "octet_4"=>"int"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@af2cdd9f
[INFO ] 2020-08-08 14:58:27.268 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[DEBUG] 2020-08-08 14:58:27.431 [[main]>worker1] CompiledPipeline - Compiled conditional
 [if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@9555b4b7
[DEBUG] 2020-08-08 14:58:27.504 [[main]>worker0] CompiledPipeline - Compiled conditional
 [if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@9555b4b7
[DEBUG] 2020-08-08 14:58:27.599 [[main]>worker1] CompiledPipeline - Compiled filter
 P[filter-drop{}|[str]pipeline:31:9:```
drop { }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@28522e30
[DEBUG] 2020-08-08 14:58:27.709 [[main]>worker0] CompiledPipeline - Compiled filter
 P[filter-drop{}|[str]pipeline:31:9:```
drop { }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@28522e30
[DEBUG] 2020-08-08 14:58:27.774 [[main]>worker0] CompiledPipeline - Compiled output
 P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:37:5:```
stdout { codec => rubydebug }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@d2b6e9b7
[DEBUG] 2020-08-08 14:58:27.848 [[main]>worker1] CompiledPipeline - Compiled output
 P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:37:5:```
stdout { codec => rubydebug }
```] 
 into 
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@d2b6e9b7
...
192.16[DEBUG] 2020-08-08 14:58:35.333 [pool-3-thread-2] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2020-08-08 14:58:35.334 [pool-3-thread-2] jvm - collector name {:name=>"ConcurrentMarkSweep"}
8.1.1[DEBUG] 2020-08-08 14:58:36.507 [logstash-pipeline-flush] PeriodicFlush - Pushing flush onto pipeline.

[DEBUG] 2020-08-08 14:58:36.810 [[main]>worker1] Dissector - Event before dissection {"event"=>{"@version"=>"1", "message"=>"192.168.1.1", "@timestamp"=>2020-08-08T11:58:36.677Z, "host"=>"elk"}}
[DEBUG] 2020-08-08 14:58:36.819 [[main]>worker1] Dissector - Event after dissection {"event"=>{"@version"=>"1", "octet_1"=>192, "@timestamp"=>2020-08-08T11:58:36.677Z, "octet_2"=>168, "octet_4"=>1, "message"=>"192.168.1.1", "host"=>"elk", "octet_3"=>1}}
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
      "@version" => "1",
       "octet_1" => 192,
    "@timestamp" => 2020-08-08T11:58:36.677Z,
       "octet_2" => 168,
       "octet_4" => 1,
       "message" => "192.168.1.1",
          "host" => "elk",
       "octet_3" => 1
}
[DEBUG] 2020-08-08 14:58:40.397 [pool-3-thread-2] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2020-08-08 14:58:40.401 [pool-3-thread-2] jvm - collector name {:name=>"ConcurrentMarkSweep"}

Sorry, it did not fit in two messages. One more :slight_smile:

^C[WARN ] 2020-08-08 14:58:43.439 [SIGINT handler] runner - SIGINT received. Shutting down.
[DEBUG] 2020-08-08 14:58:43.478 [LogStash::Runner] os - Stopping
[DEBUG] 2020-08-08 14:58:43.508 [LogStash::Runner] jvm - Stopping
[DEBUG] 2020-08-08 14:58:43.509 [LogStash::Runner] persistentqueue - Stopping
[DEBUG] 2020-08-08 14:58:43.509 [LogStash::Runner] deadletterqueue - Stopping
[DEBUG] 2020-08-08 14:58:43.571 [LogStash::Runner] agent - Shutting down all pipelines {:pipelines_count=>1}
[DEBUG] 2020-08-08 14:58:43.576 [LogStash::Runner] agent - Converging pipelines state {:actions_count=>1}
[DEBUG] 2020-08-08 14:58:43.580 [Converge PipelineAction::Stop<main>] agent - Executing action {:action=>LogStash::PipelineAction::Stop/pipeline_id:main}
[DEBUG] 2020-08-08 14:58:43.663 [Converge PipelineAction::Stop<main>] javapipeline - Closing inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b sleep>"}
[DEBUG] 2020-08-08 14:58:43.666 [Converge PipelineAction::Stop<main>] stdin - Stopping {:plugin=>"LogStash::Inputs::Stdin"}
[DEBUG] 2020-08-08 14:58:43.671 [[main]<stdin] stdin - Stopping {:plugin=>"LogStash::Inputs::Stdin"}
[DEBUG] 2020-08-08 14:58:43.674 [Converge PipelineAction::Stop<main>] javapipeline - Closed inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b sleep>"}
[DEBUG] 2020-08-08 14:58:43.677 [Converge PipelineAction::Stop<main>] javapipeline - Closing inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b sleep>"}
[DEBUG] 2020-08-08 14:58:43.682 [[main]<stdin] stdin - Closing {:plugin=>"LogStash::Inputs::Stdin"}
[DEBUG] 2020-08-08 14:58:43.688 [[main]-pipeline-manager] javapipeline - Input plugins stopped! Will shutdown filter/output workers. {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b run>"}
[DEBUG] 2020-08-08 14:58:43.703 [[main]-pipeline-manager] javapipeline - Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x6ee43cfd run>"}
[DEBUG] 2020-08-08 14:58:43.760 [[main]-pipeline-manager] javapipeline - Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x30c7eea8 dead>"}
[DEBUG] 2020-08-08 14:58:43.762 [Converge PipelineAction::Stop<main>] javapipeline - Worker closed {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b run>"}
[DEBUG] 2020-08-08 14:58:43.780 [[main]-pipeline-manager] dissect - Closing {:plugin=>"LogStash::Filters::Dissect"}
[DEBUG] 2020-08-08 14:58:43.785 [[main]-pipeline-manager] drop - Closing {:plugin=>"LogStash::Filters::Drop"}
[DEBUG] 2020-08-08 14:58:43.786 [[main]-pipeline-manager] stdout - Closing {:plugin=>"LogStash::Outputs::Stdout"}
[DEBUG] 2020-08-08 14:58:43.790 [[main]-pipeline-manager] javapipeline - Pipeline has been shutdown {:pipeline_id=>"main", :thread=>"#<Thread:0x41055b7b run>"}
[INFO ] 2020-08-08 14:58:43.803 [Converge PipelineAction::Stop<main>] javapipeline - Pipeline terminated {"pipeline.id"=>"main"}
[INFO ] 2020-08-08 14:58:43.821 [LogStash::Runner] runner - Logstash shut down.

The compiled version of the conditional looks wrong to me!

Yes, it does look that way. But I ran the Mutate filter's convert option in debug mode, and the compiled version of the conditional looks the same:

[DEBUG] 2020-08-08 22:30:10.161 [Converge PipelineAction::Create<main>] JavaBasePipelineExt - Compiled pipeline code for pipeline main : **GRAPH**
Vertices: 7 Edges: 7
----------------------P[input-stdin{}|[str]pipeline:2:5:```
stdin { }
```] -> __QUEUE__
__QUEUE__ -> P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }   
    }
```]
P[filter-dissect{"mapping"=>{"message"=>"%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"}}|[str]pipeline:8:5:```
dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }   
    }
```] -> P[filter-mutate{"convert"=>{"octet_1"=>"integer", "octet_2"=>"integer", "octet_3"=>"integer", "octet_4"=>"integer"}}|[str]pipeline:14:5:```
mutate {
        convert => {
            "octet_1" => "integer"
            "octet_2" => "integer"
            "octet_3" => "integer"
            "octet_4" => "integer"
        }
    }
```]
P[filter-mutate{"convert"=>{"octet_1"=>"integer", "octet_2"=>"integer", "octet_3"=>"integer", "octet_4"=>"integer"}}|[str]pipeline:14:5:```
mutate {
        convert => {
            "octet_1" => "integer"
            "octet_2" => "integer"
            "octet_3" => "integer"
            "octet_4" => "integer"
        }
    }
```] -> [if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))]
[if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] -|true|-> P[filter-drop{}|[str]pipeline:34:9:```
drop { }
```]
[if ((event.getField('[octet_1]')==0)||((event.getField('[octet_1]')==10)||((event.getField('[octet_1]')==127)||(((event.getField('[octet_1]')==100)&&((event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)))||(((event.getField('[octet_1]')==172)&&((event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)))||(((event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168))||((event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254))))))))] -|false|-> P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:40:5:```
stdout { codec => rubydebug }
```]
P[filter-drop{}|[str]pipeline:34:9:```
drop { }
```] -> P[output-stdout{"codec"=>"rubydebug"}|[str]pipeline:40:5:```
stdout { codec => rubydebug }
```]
**GRAPH**

Having been through it, the reason it looks wrong is that it explicitly includes the implicit short-circuiting used when evaluating conditionals.

The conditional

a or b or c or d

requires just one of the four to be true. If the first is true, the system will not bother evaluating the rest. So you can represent that as

a or (b or (c or d))

which means it tests a, and if that is true it does not bother evaluating the RHS '(b or (c or d))'. The more clauses the condition has, the greater the number of closing parentheses at the end.

Looking at the pseudo-code it produces:

  1. if (
  2. (event.getField('[octet_1]')==0) ||
  3. (
  4.  (event.getField('[octet_1]')==10)||
  5.  (
  6.   (event.getField('[octet_1]')==127)||
  7.   (
  8.    (
  9.     (event.getField('[octet_1]')==100)&&
  10.      (
  11.       (event.getField('[octet_2]')>=64)&&(event.getField('[octet_2]')<=127)
  12.      )
  13.     )||
  14.     (
  15.      (
  16.       (event.getField('[octet_1]')==172)&&
  17.        (
  18.         (event.getField('[octet_2]')>=16)&&(event.getField('[octet_2]')<=31)
  19.        )
  20.       )||
  21.       (
  22.        (
  23.         (event.getField('[octet_1]')==192)&&(event.getField('[octet_2]')==168)
  24.        )||
  25.        (
  26.         (event.getField('[octet_1]')==169)&&(event.getField('[octet_2]')==254)
  27.        )
  28.       )
  29.      )
  30.     )
  31.    )
  32.   )
  33.  )

Line 2 evaluates false, so it has to evaluate 3 through 31. 4 is false, so it must evaluate 5 through 30. 6 is false, so it moves on to 7 through 29. 9 is false, so it moves on to 13. 16 is false, so it moves on to 20. 23 should evaluate true, so 21 through 28 evaluates true, so 7 through 29 evaluates true, so the whole thing evaluates true.

All the same, it is not a matter of the logical operations in the condition. The condition in the compiled version is the same as in the text version, and all the logical operations are correct.
If it evaluates true, the event should be dropped. But that does not happen with convert_datatype.

And yet the conversion does take place. I am unable to reproduce this.

It's funny.
I updated Logstash to version 7.8.1. That did not help.
I updated OpenJDK to version "11.0.2". That didn't help either.
And then I tried this:

input {
    stdin { }
}

filter {
    #split IPv4 address into octets
    dissect {
        mapping => {
            "message" => "%{octet_1}.%{octet_2}.%{octet_3}.%{octet_4}"
        }
        convert_datatype => {
            "octet_1" => "int"
            "octet_2" => "int"
            "octet_3" => "int"
            "octet_4" => "int"
        }
    }

    if ([octet_1] != 0) {
        #add field 'TRUE' if 'octet_1' is not zero
        mutate {
            add_field => {
                "TRUE" => "Get True"
            }
        }

    } else {
        #add field 'FALSE' if 'octet_1' is zero
        mutate {
            add_field => {
                "FALSE" => "Get False"
            }
        }
    }
}

output {
    stdout { codec => rubydebug }
}

Result:

1.0.0.0
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
          "TRUE" => "Get True",
      "@version" => "1",
    "@timestamp" => 2020-08-10T19:46:00.213Z,
          "host" => "elk",
       "octet_1" => 1,
       "octet_3" => 0,
       "message" => "1.0.0.0",
       "octet_4" => 0,
       "octet_2" => 0
}
0.0.0.0
{
          "TRUE" => "Get True",
      "@version" => "1",
    "@timestamp" => 2020-08-10T19:46:21.353Z,
          "host" => "elk",
       "octet_1" => 0,
       "octet_3" => 0,
       "message" => "0.0.0.0",
       "octet_4" => 0,
       "octet_2" => 0
}

How do you like that, Badger? :slight_smile:

That is very, very strange.

Taking a wild stab in the dark, can you disable java_execution?

Yeah! I disabled pipeline.java_execution in the /etc/logstash/logstash.yml file, and that helped.
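
For reference, that amounts to a single line in logstash.yml, which switches the pipeline back to the old Ruby execution engine:

    pipeline.java_execution: false

And the run: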

max@elk:/usr/share/logstash$ sudo bin/logstash -f conf.d/test_convert_datatype.conf --path.settings /etc/logstash
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-08-11T09:23:11,596][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-08-11T09:23:12,085][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.8.1"}
[2020-08-11T09:23:30,418][INFO ][org.reflections.Reflections] Reflections took 304 ms to scan 1 urls, producing 21 keys and 41 values
[2020-08-11T09:23:39,721][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.sources"=>["/usr/share/logstash/conf.d/test_convert_datatype.conf"]}
[2020-08-11T09:23:40,725][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x249a764a run>"}
The stdin plugin is now waiting for input:
[2020-08-11T09:23:41,044][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-08-11T09:23:42,300][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
1.0.0.0
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
          "host" => "elk",
       "octet_1" => 1,
       "octet_2" => 0,
          "TRUE" => "Get True",
      "@version" => "1",
       "message" => "1.0.0.0",
    "@timestamp" => 2020-08-11T06:25:31.219Z,
       "octet_3" => 0,
       "octet_4" => 0
}
0.0.0.0
{
          "host" => "elk",
       "octet_1" => 0,
       "octet_2" => 0,
      "@version" => "1",
       "message" => "0.0.0.0",
    "@timestamp" => 2020-08-11T06:25:40.687Z,
       "octet_3" => 0,
       "octet_4" => 0,
         "FALSE" => "Get False"
}

What was that?

It is clearly a bug. You could open an issue on GitHub, but I do not think anyone from Elastic reviews new issues there, so it will probably never get addressed.

Hello Badger.
Thanks for the help. I read here https://www.elastic.co/blog/logstash-6-5-0-released that in version 6.5.0 the Java execution engine was promoted to beta, but I didn't find up-to-date information on its current status. Just in case, I opened an issue on GitHub: https://github.com/logstash-plugins/logstash-filter-dissect/issues/73

java_execution became the default in 7.0. At some point the Ruby engine will be removed; I am not sure of the timeline for that.
