Importing MySQL column into Elasticsearch as nested field with jdbc plugin

Hi, I am new to Elasticsearch and have spent the last two weeks trying to learn everything I can to the best of my ability, but as one can imagine, it is somewhat overwhelming from a newcomer's standpoint.

I am trying to import MySQL data as suggestions. This currently works fine; however, the problem comes when I attempt to add payload data such as a userId field. My initial thought was to perform the following query:

SELECT user_id as 'suggestion.payloads.userId' FROM MyTable WHERE MyColumn = MyValue;

After doing some digging through the logs to see what the issue was, I found the following error message:

MapperParsingException[Field name [suggestion.payloads.userId] cannot contain '.']

Would anyone be able to lend a helping hand to overcome this issue? Any help or advice would be hugely appreciated. I have been trying all day and night to resolve this, but it seems I'm missing something due to my lack of knowledge.

Thank you.

Did you try SELECT user_id as '[suggestion][payloads][userId]' ...? If that doesn't work, I suspect you have to select fields with flat names and rename them afterwards.

More on field reference syntax: https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references

@magnusbaeck - Thanks for taking the time to lend a helping hand. Unfortunately, that did not work either; it is indexed in Elasticsearch like so:

      "@timestamp": "2017-08-08T10:31:06.871Z",
      "@version": "1",
      "[suggestion][payloads][userId]": 1 

I have also performed a suggestion query to double-check that the field definitely wasn't part of the payload, and it confirmed it did not work.

After reading the URL provided, I am trying to put a config file together and will do some testing. Would you say this is along the right lines, or am I getting it all wrong? Whenever I run the config file below, I get the following error message:

"exception"=>"expecting List or Map, found class org.logstash.bivalues.StringBiValue

input {
  jdbc {
    jdbc_driver_library => "/home/logstash/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306"
    jdbc_user => "username"
    jdbc_password => "password"
    statement => "SELECT userId FROM users WHERE id <= 10"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}

filter {
  mutate { add_field => { "[suggestion][payloads][userId]" => "%{[userId]}" } }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "users"
    document_type => "user"
  }
}

Once again, thanks for your time. I really appreciate it.

This looks correct, but the best way to rename a field is to use the mutate filter's rename option.
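For reference, a minimal sketch of that approach, assuming the userId field produced by the SQL statement in the config above:

```
filter {
  mutate {
    # Move the flat userId field from the jdbc input into the
    # nested suggestion.payloads.userId structure. The source
    # field goes on the left, the destination on the right.
    rename => { "userId" => "[suggestion][payloads][userId]" }
  }
}
```

Unlike add_field with %{...} sprintf interpolation, which always produces a string, rename preserves the field's original type, so a numeric userId stays numeric.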

@magnusbaeck - Would you be willing to help me out with an example of how I can go about converting the userId field into a nested field, so that Logstash can put the given field into suggestion.payloads?

I have been trying for the last 48 hours and have had no luck whatsoever. I've literally hit a stalemate with the project until I can overcome this issue with indexing the data with the required fields.

I tried using the rename method, but I'm certain I am doing it wrong.

filter {
    mutate {
        rename => {"[suggestion][payloads][userid]" => "%{[userid]}"}
    }
}

I would be greatly appreciative if you could help me overcome the issue and possibly provide an example of how you would structure your config file to achieve the desired result when using a test SQL statement like so:

SELECT userid FROM mytable WHERE myvalue <= 10;

How do I then get the userid values into the correct format, so Elasticsearch will know that I'm sending this as a suggestion.payloads.userid field?

Thanks Magnus, it will be a huge help.

I tried using the rename method, but I'm certain I am doing it wrong.

Yes, you're trying to rename [suggestion][payloads][userid] to userid instead of the other way around.
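That is, the old field name belongs on the left-hand side and the new nested name on the right; a corrected sketch of the filter posted above:

```
filter {
  mutate {
    # rename => { "old_name" => "new_name" }
    rename => { "userid" => "[suggestion][payloads][userid]" }
  }
}
```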

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.