SOLVED - Issue converting JSON after JDBC Streaming

Hi Team,

I have a Logstash config file that takes its input from SQS. I apply a json filter to convert the JSON message from SQS into a data structure (this is working fine).

I then apply a jdbc_streaming filter to enrich the SQS message with additional information from an external source. I want to take the result of that (which appears to come back as JSON) and add it to the data structure. However, it appears to be throwing an exception:

[2017-06-26T16:51:19,710][WARN ][logstash.filters.json ] Error parsing json {:source=>"aircraft", :raw=>[{"owner_airline_icao"=>"SKW", "seat_count"=>76, "aircraft_type"=>"ERJ 175", "partner_airline_icao"=>"ASA", "normalized_registration_number"=>"N181SY", "transaction_routing_id"=>"L1"}], :exception=>java.lang.ClassCastException: org.jruby.RubyArray cannot be cast to org.jruby.RubyIO}

My config file is below. This is on Logstash 5.3.0, going to ES 5.3.0 on Elastic Cloud.

input {
	sqs {
		queue => "flight-info"
		threads => 5
	}
}

filter {
	json {
		source => "Message"
	}
	mutate {
		remove_field => ["Message","TopicArn","UnsubscribeURL","SignatureVersion","Type","Signature","SigningCertURL"]
	}
	if [aircraft_id] == "0" {
		drop { }
	}
}

filter {
	jdbc_streaming {
		jdbc_driver_library => "/opt/logstash-5.3.0/bin/mysql-connector-java-5.1.42-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://url/db"
		jdbc_user => "user"
		jdbc_password => "password"
		statement => "select normalized_registration_number, aircraft_type, transaction_routing_id, owner_airline_icao, partner_airline_icao, seat_count from fig_dw.aircraft where id = :aircraft"
		parameters => { "aircraft" => "aircraft_id" }
		target => "aircraft"
	}
}

filter {
	json {
		source => "aircraft"
	}
	mutate {
		remove_field => ["aircraft"]
	}
}


output {
	elasticsearch {
		hosts => "https://cluster.us-east-1.aws.found.io:9243"
		index => "myindex"
		user => "elastic"
		password => "mypassword"
		document_type => "mydoc"
	}
}

Any help greatly appreciated.

Thanks
Wayne

What, exactly, does aircraft look like after the jdbc_streaming filter? By the look of things it seems it already is an array.

@magnusbaeck, the field looks like:

fig2dw_aircraft {
  "owner_airline_icao": "AAL",
  "seat_count": 160,
  "aircraft_type": "B737-823",
  "partner_airline_icao": "AAL",
  "normalized_registration_number": "N858NN",
  "transaction_routing_id": "A1"
}

I want to break that into its individual elements, i.e. separate fields rather than one big JSON object in a single field.

That's not valid JSON so I'm confused. Please comment out your json and mutate filters and show an example event from ES. Copy/paste from the JSON tab in Kibana's Discover panel or use a stdout { codec => rubydebug } output.
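For example, a minimal debug output section (a sketch; you can run it alongside or instead of the elasticsearch output) would print each full event to the console:

```
output {
	stdout { codec => rubydebug }
}
```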

Hi @magnusbaeck thanks for helping.

As requested:

{
      "_index": "myindes",
      "_type": "mytype",
      "_id": "AVzmV-ovFRUeD1qZhGKI",
      "_score": null,
      "_source": {
        "updated_time": "1498513140000",
        "registration_number": "N910XJ",
        "fig2dw_aircraft": [
          {
            "owner_airline_icao": "FLG",
            "seat_count": 76,
            "aircraft_type": "CL-600-2D24",
            "partner_airline_icao": "DAL",
            "normalized_registration_number": "N910XJ",
            "transaction_routing_id": "D1"
          }
        ],
        "flight_identifier": "FLG4114",
        "Timestamp": "2017-06-26T21:39:00.187Z",
        "aircraft_id": "4727",
        "@timestamp": "2017-06-26T21:39:00.261Z",
        "@version": "1",
        "id": "12669696",
        "flight_state": "TOUCHDOWN",
        "MessageId": "a0a68ae8-0bfc-5c7e-8831-db238ac8b68a"
      },
      "fields": {
        "@timestamp": [
          1498513140261
        ],
        "Timestamp": [
          1498513140187
        ]
      },
      "sort": [
        1498513140187
      ]
}

I'm confused. Where's the aircraft field you're talking about?

Is it the fig2dw_aircraft field you somehow want to transform or process? How?

Hi @magnusbaeck, yes, that's the field. Sorry for any confusion.

I want to take that field and split it into individual fields, just like I did for the SQS message, so that each field is a separate part of the document and I can filter, search, and visualize on them later.

SOLVED - thanks @magnusbaeck for nudging me to realize this wasn't JSON to parse.
The split filter works perfectly fine.

I just need to pretty up and rename some fields following it but worked perfectly in our sandbox environment after the change.
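For anyone who hits the same ClassCastException: the jdbc_streaming target is already a parsed array of hashes, so there is nothing for the json filter to parse. Roughly, I replaced my second json filter with something like the following (a sketch based on the field names earlier in this thread, not my exact production config; the rename list is the optional "pretty up" step):

```
filter {
	# The jdbc_streaming target is already an array of hashes,
	# so split the one-element array into the event itself...
	split {
		field => "aircraft"
	}
	# ...then promote the nested fields to top-level fields.
	mutate {
		rename => {
			"[aircraft][owner_airline_icao]"             => "owner_airline_icao"
			"[aircraft][seat_count]"                     => "seat_count"
			"[aircraft][aircraft_type]"                  => "aircraft_type"
			"[aircraft][partner_airline_icao]"           => "partner_airline_icao"
			"[aircraft][normalized_registration_number]" => "normalized_registration_number"
			"[aircraft][transaction_routing_id]"         => "transaction_routing_id"
		}
		remove_field => ["aircraft"]
	}
}
```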

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.