JDBC Input with Aggregate Filter

Hello,

I have a pipeline that reads data from SQL Server and aggregates records by a unique ID, but unfortunately I don't get any output.

My config file is as below.

input {
  jdbc {
    jdbc_connection_string => "my connection String"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "XXX"
    statement => "select  partID, featureName, name FROM dbo.Parametric WHERE partid =16942660 or partid =  7406888 ORDER BY PartID"
  }
}

filter {
  aggregate {
    task_id => "%{partID}"
    code => "
      map['featureNames'] ||= []
      map['featureNames'] << { 'featureName' => event.get('featureName'), 'name' => event.get('name') }
    "
    push_map_as_event_on_timeout => true
    timeout_tags => ['aggregate']
  }

  if "featureNames" not in [tags] {
    drop {}
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

When I run it with "bin\logstash -f myconfig.conf -w 1", Logstash runs successfully but produces no output.

If I remove the conditional

  if "featureNames" not in [tags] {
    drop {}
  }

I get the output, but without any aggregation, like this:

{
    "featurename" => "Failure Rate",
       "@version" => "1",
         "partid" => 7406888,
           "name" => "N/A",
     "@timestamp" => 2019-10-29T09:01:51.654Z
}
{
    "featurename" => "Minimum Operating Temperature
       "@version" => "1",
         "partid" => 7406888,
           "name" => "-55°C",
     "@timestamp" => 2019-10-29T09:01:51.654Z
}
{
    "featurename" => "Features",
       "@version" => "1",
         "partid" => 7406888,
           "name" => "High Reliability",
     "@timestamp" => 2019-10-29T09:01:51.656Z
}
{
    "featurename" => "Type",
       "@version" => "1",
         "partid" => 7406888,
           "name" => "Molded",
     "@timestamp" => 2019-10-29T09:01:51.724Z
}
{
    "featurename" => "Maximum Operating Temperature
       "@version" => "1",
         "partid" => 7406888,
           "name" => "125°C",
     "@timestamp" => 2019-10-29T09:01:51.725Z
}
{
    "featurename" => "Lifetime @ Temp.",
       "@version" => "1",
         "partid" => 7406888,
           "name" => "N/A",
     "@timestamp" => 2019-10-29T09:01:51.726Z
}
{
    "featurename" => "CCC/CQC",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "No",
     "@timestamp" => 2019-10-29T09:01:51.727Z
}
{
    "featurename" => "Contact Finish",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "Tin",
     "@timestamp" => 2019-10-29T09:01:51.727Z
}
{
    "featurename" => "Wire Gauge",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "24 AWG",
     "@timestamp" => 2019-10-29T09:01:51.728Z
}
{
    "featurename" => "Contact Finish Thickness",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "2.54µm",
     "@timestamp" => 2019-10-29T09:01:51.728Z
}
{
    "featurename" => "Contact End",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "Socket to Socket",
     "@timestamp" => 2019-10-29T09:01:51.731Z
}
{
    "featurename" => "Length",
       "@version" => "1",
         "partid" => 16942660,
           "name" => "254mm",
     "@timestamp" => 2019-10-29T09:01:51.731Z
}

Can anyone tell me what's wrong with my configuration?

I also made sure that pipeline.workers is set to one
and used bin\logstash -f myconfig.conf, but got the same results.

The default value for the timeout option on an aggregate filter is 1800 seconds. Are you waiting half an hour? If not, set the timeout option to something smaller.
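For example, adapted from the config above (the value of 5 seconds is just an illustration):

filter {
  aggregate {
    task_id => "%{partID}"
    code => "
      map['featureNames'] ||= []
      map['featureNames'] << { 'featureName' => event.get('featureName'), 'name' => event.get('name') }
    "
    push_map_as_event_on_timeout => true
    timeout => 5                       # seconds; the default is 1800
    timeout_tags => ['aggregate']
  }
}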

Hello Badger, thanks for your help, but unfortunately I tried setting the timeout to 3, 5, and other values, with the same result. The problem is that Logstash starts the pipeline successfully and then shuts down after about one second.

[2019-10-30T09:35:46,089][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-10-30T09:35:48,519][INFO ][logstash.inputs.jdbc     ][main] (0.255277s) select  partid, featurename, name FROM dbo.Parametric WHERE partid =16942660 or partid =  7406888 ORDER BY PartID
[2019-10-30T09:35:49,386][INFO ][logstash.runner          ] Logstash shut down.

If you do not set the schedule option then once a jdbc input has run the query there is nothing more for it to do, so I think it shuts itself down, which causes logstash to exit.
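For example, adding a schedule makes the input re-run the query periodically, which keeps the pipeline alive (the cron string below, once a minute, is just an illustration):

input {
  jdbc {
    # connection settings as in the config above
    schedule => "* * * * *"    # run the query once a minute
  }
}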

Thanks so much Badger, it's working now. But can you tell me how to remove the index brackets ([0], [1], ...) from the output?

{
      "@timestamp" => 2019-10-30T14:14:09.204Z,
        "@version" => "1",
    "featureNames" => [
        [ 0] {
            "featurename" => "ESR Type",
                   "name" => "Standard"
        },
        [ 1] {
            "featurename" => "Ratings",
                   "name" => "N/A"
        },
        [ 2] {
            "featurename" => "Tolerance",
                   "name" => "±20%"
        },
        [ 3] {
            "featurename" => "Capacitance",
                   "name" => "47µF"
        },
        [ 4] {
            "featurename" => "Voltage Rating",
                   "name" => "10V"
        },
        [ 5] {
            "featurename" => "Weight",
                   "name" => "N/A"
        },
        [ 6] {
            "featurename" => "ESR",
                   "name" => "900mOhm @ 120Hz"
        },
        [ 7] {
            "featurename" => "Failure Rate",
                   "name" => "N/A"
        },
        [ 8] {
            "featurename" => "Minimum Operating Temperature",
                   "name" => "-55°C"
        },

That is how the rubydebug codec formats an array. If you want another format, such as JSON, use a different codec on the output, for example stdout { codec => json_lines }.
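As a sketch of the difference: the index markers come from the pretty-printer that rubydebug uses; a JSON encoding of the same array has no such markers. A minimal Ruby illustration (plain Ruby, not Logstash itself; the hash is shaped like the aggregated event above):

```ruby
require 'json'

# A hash shaped like the aggregated event shown earlier
event = {
  'featureNames' => [
    { 'featurename' => 'ESR Type',  'name' => 'Standard' },
    { 'featurename' => 'Tolerance', 'name' => '±20%' }
  ]
}

# JSON renders the array as plain [...], with no [0], [1] index labels
puts JSON.generate(event)
```

In a Logstash output, switching the codec from rubydebug to json_lines gives you this kind of one-line JSON per event.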


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.