Databricks connector
Flyte can be integrated with Databricks, enabling you to submit Spark jobs to the Databricks platform.
Installation
The Databricks connector comes bundled with the Spark plugin. To install the Spark plugin, run the following command:
$ pip install flytekitplugins-spark
Example usage
For a usage example, see Databricks connector example usage.
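To illustrate the shape such a task takes, here is a minimal sketch that declares a Spark task with a Databricks task config. The field names (spark_conf, databricks_conf, databricks_instance) follow the flytekitplugins-spark Databricks config, but the cluster settings, instance URL, and run values are placeholders; defer to the linked example for an authoritative version.

import flytekit
from flytekit import task
from flytekitplugins.spark import Databricks

@task(
    task_config=Databricks(
        # Standard Spark configuration applied to the job.
        spark_conf={"spark.driver.memory": "1000M"},
        # Payload forwarded to the Databricks Jobs API; the cluster values
        # below are placeholders, not recommendations.
        databricks_conf={
            "run_name": "flyte-databricks-example",
            "new_cluster": {
                "spark_version": "12.2.x-scala2.12",
                "node_type_id": "n2-highmem-4",
                "num_workers": 1,
            },
            "timeout_seconds": 3600,
            "max_retries": 1,
        },
        databricks_instance="<your-account>.cloud.databricks.com",
    )
)
def count_rows(n: int) -> int:
    # The Spark plugin injects a Spark session into the task context.
    sess = flytekit.current_context().spark_session
    return sess.sparkContext.parallelize(range(n)).count()

When this task runs in a Flyte deployment with the connector enabled, the connector submits the job to Databricks on your behalf rather than launching Spark inside the Flyte cluster.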
Local testing
To test the Databricks connector locally, create a class for the connector task that inherits from AsyncConnectorExecutorMixin. This mixin handles asynchronous task execution and allows flytekit to mimic FlytePropeller’s behavior when calling the connector.
When testing locally, you may also need to store credentials, such as a Databricks access token, in your local environment.
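A minimal sketch of the pattern follows. The import paths and base task class are assumptions that may differ across flytekit releases; only the use of AsyncConnectorExecutorMixin as a mixin on the task class is the point being illustrated.

# Sketch only: the import paths and the base class are assumptions and may
# differ across flytekit releases. AsyncConnectorExecutorMixin is the piece
# that routes local execution through the connector.
from flytekit.extend.backend.base_connector import AsyncConnectorExecutorMixin  # assumed path
from flytekitplugins.spark.task import PysparkFunctionTask  # assumed base task class

class LocalDatabricksTask(AsyncConnectorExecutorMixin, PysparkFunctionTask):
    """Running this task locally drives the Databricks connector the same way
    FlytePropeller would in a cluster deployment."""

Before running, export any credentials the connector needs (for Databricks, an access token, typically supplied through an environment variable; check the setup guide for the exact variable name).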
Flyte deployment configuration
If you are using a managed deployment of Flyte, you will need to contact your deployment administrator to configure connectors in your deployment.
To enable the Databricks connector in your Flyte deployment, see the Databricks connector setup guide.