Scripted fields in PySpark

Hey,
Is it possible to use scripted fields with PySpark?
If so, how can I use them?

And if not, how can I query a field and convert it from float to integer in the query itself?

thanks

I have not tried it (yet), but could you just pass it in as part of your query using the es.query configuration, like the example at Apache Spark support | Elasticsearch for Apache Hadoop [8.6] | Elastic? There is more information about es.query at Configuration | Elasticsearch for Apache Hadoop [8.6] | Elastic.
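Something along these lines might work as a starting point. This is an untested sketch: the index name (`my-index`), float field name (`price`), node address, and connector version are all placeholders, and I am not certain the connector will actually surface the `script_fields` output in the DataFrame, since it normally reads documents from `_source`.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("es-scripted-field-sketch")
    # The elasticsearch-hadoop Spark connector has to be on the classpath;
    # the version here is just an assumption matching the 8.6 docs.
    .config("spark.jars.packages",
            "org.elasticsearch:elasticsearch-spark-30_2.12:8.6.2")
    .getOrCreate()
)

# Full query DSL passed through es.query; the script field casts the
# float value to an integer using Painless.
es_query = """
{
  "query": { "match_all": {} },
  "script_fields": {
    "price_int": {
      "script": { "source": "(int) doc['price'].value" }
    }
  }
}
"""

df = (
    spark.read
    .format("org.elasticsearch.spark.sql")
    .option("es.nodes", "localhost")
    .option("es.port", "9200")
    .option("es.query", es_query)
    .load("my-index")
)

df.printSchema()
df.show(5)
```

If the scripted field does not show up in the resulting DataFrame, a fallback for the float-to-integer part would be to cast after loading, e.g. `df.withColumn("price_int", df["price"].cast("int"))`.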
