flytekitplugins.spark.connector
| Property | Type | Description |
| --- | --- | --- |
| `DATABRICKS_API_ENDPOINT` | `str` | |
| `DEFAULT_DATABRICKS_INSTANCE_ENV_KEY` | `str` | |
| `FLYTE_FAIL_ON_ERROR` | `str` | |
```python
def result_state_is_available(
    life_cycle_state: str,
) -> bool
```

| Parameter | Type | Description |
| --- | --- | --- |
| `life_cycle_state` | `str` | |
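A minimal sketch of this helper, assuming Databricks reports a run's progress as a life-cycle state string and that the result state becomes readable only once the run reaches the terminal `TERMINATED` state; the real predicate may also treat other terminal states as result-bearing.

```python
# Sketch, not the plugin's implementation: treat only "TERMINATED" as the
# state in which a Databricks run's result state is available to read.
def result_state_is_available(life_cycle_state: str) -> bool:
    """Return True once the run's result state can be read."""
    return life_cycle_state == "TERMINATED"
```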
```python
def DatabricksConnector()
```

| Property | Type | Description |
| --- | --- | --- |
| `metadata_type` | `None` | |
| `task_category` | `None` | task category that the connector supports |
| Method | Description |
| --- | --- |
| `create()` | Return a resource meta that can be used to get the status of the task. |
| `delete()` | Delete the task. |
| `get()` | Return the status of the task, and return the outputs in some cases. |
| `get_logs()` | Return the logs for the task. |
| `get_metrics()` | Return the metrics for the task. |
```python
def create(
    task_template: flytekit.models.task.TaskTemplate,
    inputs: typing.Optional[flytekit.models.literals.LiteralMap],
    **kwargs,
) -> flytekitplugins.spark.connector.DatabricksJobMetadata
```

Return a resource meta that can be used to get the status of the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `task_template` | `flytekit.models.task.TaskTemplate` | |
| `inputs` | `typing.Optional[flytekit.models.literals.LiteralMap]` | |
| `kwargs` | `**kwargs` | |
```python
def delete(
    resource_meta: flytekitplugins.spark.connector.DatabricksJobMetadata,
    **kwargs,
)
```

Delete the task. This call should be idempotent, and it should raise an error if it fails to delete the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekitplugins.spark.connector.DatabricksJobMetadata` | |
| `kwargs` | `**kwargs` | |
```python
def get(
    resource_meta: flytekitplugins.spark.connector.DatabricksJobMetadata,
    **kwargs,
) -> flytekit.extend.backend.base_connector.Resource
```

Return the status of the task, and return the outputs in some cases. For example, a BigQuery job can't write the structured dataset to the output location, so it returns the output literals to the propeller, and the propeller writes the structured dataset to the blob store.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekitplugins.spark.connector.DatabricksJobMetadata` | |
| `kwargs` | `**kwargs` | |
```python
def get_logs(
    resource_meta: flytekit.extend.backend.base_connector.ResourceMeta,
    **kwargs,
) -> flyteidl.admin.agent_pb2.GetTaskLogsResponse
```

Return the logs for the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekit.extend.backend.base_connector.ResourceMeta` | |
| `kwargs` | `**kwargs` | |
```python
def get_metrics(
    resource_meta: flytekit.extend.backend.base_connector.ResourceMeta,
    **kwargs,
) -> flyteidl.admin.agent_pb2.GetTaskMetricsResponse
```

Return the metrics for the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekit.extend.backend.base_connector.ResourceMeta` | |
| `kwargs` | `**kwargs` | |
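Taken together, `create()`, `get()`, and `delete()` form a polling lifecycle: `create` submits the Databricks run and returns a `DatabricksJobMetadata`, `get` polls its status, and `delete` cancels it idempotently. A toy sketch of that flow, with a stand-in connector and fake statuses (all names and values here are illustrative, not the plugin's actual API):

```python
from dataclasses import dataclass


@dataclass
class DatabricksJobMetadata:
    # Stand-in for flytekitplugins.spark.connector.DatabricksJobMetadata.
    databricks_instance: str
    run_id: str


class ToyDatabricksConnector:
    """Illustrates the create/get/delete lifecycle; not the real connector."""

    def __init__(self) -> None:
        self._polls: dict = {}

    def create(self, task_template=None, inputs=None) -> DatabricksJobMetadata:
        # The real connector submits a run via the Databricks Jobs API and
        # keeps enough metadata (instance, run id) to find it again later.
        meta = DatabricksJobMetadata("dbc-123.cloud.example.com", "run-1")
        self._polls[meta.run_id] = 0
        return meta

    def get(self, resource_meta: DatabricksJobMetadata) -> str:
        # Each poll advances the fake job; the real connector asks
        # Databricks for the run's life-cycle state instead.
        self._polls[resource_meta.run_id] += 1
        return "SUCCEEDED" if self._polls[resource_meta.run_id] >= 3 else "RUNNING"

    def delete(self, resource_meta: DatabricksJobMetadata) -> None:
        # Idempotent: deleting an already-deleted run is a no-op.
        self._polls.pop(resource_meta.run_id, None)


connector = ToyDatabricksConnector()
meta = connector.create()
while connector.get(meta) != "SUCCEEDED":
    pass
connector.delete(meta)
connector.delete(meta)  # safe to call again
```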
`DatabricksConnectorV2` was added to support running k8s Spark and Databricks Spark tasks together in the same workflow. This is necessary because one task type can only be handled by a single backend plugin:

- `spark` -> k8s spark plugin
- `databricks` -> databricks connector
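The one-handler-per-task-type rule above can be sketched as a registry that rejects duplicate registrations (the registry and names here are illustrative, not flytekit's actual plugin machinery):

```python
# Hypothetical registry illustrating why `spark` and `databricks` must be
# distinct task types: a second handler for the same type is rejected,
# so the k8s plugin and the Databricks connector can coexist only by
# claiming different task types.
HANDLERS: dict = {}


def register_handler(task_type: str, handler: str) -> None:
    if task_type in HANDLERS:
        raise ValueError(f"{task_type!r} is already handled by {HANDLERS[task_type]!r}")
    HANDLERS[task_type] = handler


register_handler("spark", "k8s spark plugin")
register_handler("databricks", "DatabricksConnectorV2")
```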
```python
def DatabricksConnectorV2()
```

| Property | Type | Description |
| --- | --- | --- |
| `metadata_type` | `None` | |
| `task_category` | `None` | task category that the connector supports |
| Method | Description |
| --- | --- |
| `create()` | Return a resource meta that can be used to get the status of the task. |
| `delete()` | Delete the task. |
| `get()` | Return the status of the task, and return the outputs in some cases. |
| `get_logs()` | Return the logs for the task. |
| `get_metrics()` | Return the metrics for the task. |
```python
def create(
    task_template: flytekit.models.task.TaskTemplate,
    inputs: typing.Optional[flytekit.models.literals.LiteralMap],
    **kwargs,
) -> flytekitplugins.spark.connector.DatabricksJobMetadata
```

Return a resource meta that can be used to get the status of the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `task_template` | `flytekit.models.task.TaskTemplate` | |
| `inputs` | `typing.Optional[flytekit.models.literals.LiteralMap]` | |
| `kwargs` | `**kwargs` | |
```python
def delete(
    resource_meta: flytekitplugins.spark.connector.DatabricksJobMetadata,
    **kwargs,
)
```

Delete the task. This call should be idempotent, and it should raise an error if it fails to delete the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekitplugins.spark.connector.DatabricksJobMetadata` | |
| `kwargs` | `**kwargs` | |
```python
def get(
    resource_meta: flytekitplugins.spark.connector.DatabricksJobMetadata,
    **kwargs,
) -> flytekit.extend.backend.base_connector.Resource
```

Return the status of the task, and return the outputs in some cases. For example, a BigQuery job can't write the structured dataset to the output location, so it returns the output literals to the propeller, and the propeller writes the structured dataset to the blob store.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekitplugins.spark.connector.DatabricksJobMetadata` | |
| `kwargs` | `**kwargs` | |
```python
def get_logs(
    resource_meta: flytekit.extend.backend.base_connector.ResourceMeta,
    **kwargs,
) -> flyteidl.admin.agent_pb2.GetTaskLogsResponse
```

Return the logs for the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekit.extend.backend.base_connector.ResourceMeta` | |
| `kwargs` | `**kwargs` | |
```python
def get_metrics(
    resource_meta: flytekit.extend.backend.base_connector.ResourceMeta,
    **kwargs,
) -> flyteidl.admin.agent_pb2.GetTaskMetricsResponse
```

Return the metrics for the task.

| Parameter | Type | Description |
| --- | --- | --- |
| `resource_meta` | `flytekit.extend.backend.base_connector.ResourceMeta` | |
| `kwargs` | `**kwargs` | |
```python
class DatabricksJobMetadata(
    databricks_instance: str,
    run_id: str,
)
```

| Parameter | Type | Description |
| --- | --- | --- |
| `databricks_instance` | `str` | |
| `run_id` | `str` | |

| Method | Description |
| --- | --- |
| `decode()` | Decode the resource meta from bytes. |
| `encode()` | Encode the resource meta to bytes. |
```python
def decode(
    data: bytes,
) -> ResourceMeta
```

Decode the resource meta from bytes.

| Parameter | Type | Description |
| --- | --- | --- |
| `data` | `bytes` | |
```python
def encode() -> bytes
```

Encode the resource meta to bytes.
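A minimal sketch of an encode/decode round trip, assuming the resource meta's fields are serialized to JSON bytes; the real `ResourceMeta` base class may use a different wire format, and this dataclass is an illustrative re-creation, not the plugin's class.

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class DatabricksJobMetadata:
    # Illustrative re-creation; assumes JSON as the wire format.
    databricks_instance: str
    run_id: str

    def encode(self) -> bytes:
        """Encode the resource meta to bytes."""
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def decode(cls, data: bytes) -> "DatabricksJobMetadata":
        """Decode the resource meta from bytes."""
        return cls(**json.loads(data.decode("utf-8")))


meta = DatabricksJobMetadata("dbc-123.cloud.example.com", "run-42")
restored = DatabricksJobMetadata.decode(meta.encode())
```

The round trip matters because the backend stores the encoded bytes between polls and reconstructs the metadata on each `get()` call.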