Connect to a Delta Lake instance
- for reverse-engineering: READ_METADATA and SELECT on the tables and views
- for forward-engineering (apply to instance): rights to CREATE and UPDATE schemas, tables, and views: CREATE on the catalog, plus either OWN, or both USAGE and CREATE, on the schemas, tables, and views.
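The privileges above can be granted with Databricks SQL GRANT statements. The sketch below shows what such grants might look like; the catalog, schema, and principal names (`my_catalog`, `my_schema`, `hackolade_user`) are hypothetical placeholders.

```python
# Hedged sketch of the GRANT statements an administrator might run for a
# service principal used by Hackolade. All object names are hypothetical.

REVERSE_ENGINEERING_GRANTS = [
    "GRANT READ_METADATA ON SCHEMA my_schema TO `hackolade_user`",
    "GRANT SELECT ON SCHEMA my_schema TO `hackolade_user`",
]

FORWARD_ENGINEERING_GRANTS = [
    "GRANT CREATE ON CATALOG my_catalog TO `hackolade_user`",
    # either OWN, or both USAGE and CREATE, on the schema:
    "GRANT USAGE ON SCHEMA my_schema TO `hackolade_user`",
    "GRANT CREATE ON SCHEMA my_schema TO `hackolade_user`",
]

def all_grants() -> list[str]:
    """Return every statement to submit, reverse-engineering grants first."""
    return REVERSE_ENGINEERING_GRANTS + FORWARD_ENGINEERING_GRANTS
```

Each statement would be executed by an administrator in a Databricks SQL session; granting on the schema propagates to its tables and views in most setups.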
Connecting to a Databricks instance on one of the cloud providers requires you to declare the connection settings and a personal access token.
Consult this page for more information on finding your Databricks workspace details.
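The workspace details amount to a handful of values. As a minimal sketch, the connection settings could be represented like this; the host value is a hypothetical placeholder, and the cluster ID is the one used as an example later on this page.

```python
# Hedged sketch of typical connection settings; the host is a placeholder.
connection_settings = {
    "host": "adb-1234567890123456.7.azuredatabricks.net",  # workspace URL, without the https:// scheme
    "port": 443,                                           # HTTPS
    "cluster_id": "1115-164516-often242",                  # taken from the cluster page URL
}
```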
Note: the connection must be established with compute clusters, not with a SQL Warehouse, which uses a different API that we do not support.
To get the cluster ID, click the Clusters tab in the sidebar, then select a cluster name. The cluster ID is the identifier after the /clusters/ component in the URL of that page:
In the following screenshot, the cluster ID is 1115-164516-often242:
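Extracting the cluster ID from the page URL can be sketched as a small helper; the exact URL layout may vary by workspace, so this is an assumption, not the definitive format.

```python
def cluster_id_from_url(url: str) -> str:
    """Return the cluster ID that follows the /clusters/ component of a
    Databricks cluster page URL (a sketch; real URL layouts may vary)."""
    tail = url.split("/clusters/", 1)[1]
    # the ID runs up to the next path separator, query string, or fragment
    for sep in ("/", "?", "#"):
        tail = tail.split(sep, 1)[0]
    return tail
```

For the example above, `cluster_id_from_url("https://adb-123.azuredatabricks.net/#setting/clusters/1115-164516-often242/configuration")` yields `1115-164516-often242`.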
Hackolade uses the Databricks REST API and requires that an access token be issued in the Databricks console under User Settings > Generate New Token. For more information, please consult this page. Paste the access token in the field below. Access token management is described here.
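The token is sent as a Bearer credential on each REST call. The sketch below builds such a request against the Databricks Clusters API; the host, cluster ID, and token values you pass in are your own, and actually sending the request requires a live workspace.

```python
import urllib.request

def get_cluster_request(host: str, cluster_id: str, token: str) -> urllib.request.Request:
    """Build (but do not send) a Databricks REST API request that fetches
    cluster details, authenticated with a personal access token."""
    url = f"https://{host}/api/2.0/clusters/get?cluster_id={cluster_id}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

# Sending it (requires a reachable workspace and a valid token):
# with urllib.request.urlopen(get_cluster_request(host, cluster_id, token)) as resp:
#     details = resp.read()
```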
For Databricks on Azure, you may have tables stored in Azure Data Lake Storage, which requires additional authentication for Databricks to access. In that case, you may need to declare ADLS credential passthrough:
You may want to read more about ADLS credential passthrough.
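On the cluster side, credential passthrough is enabled through a Spark configuration setting. The fragment below is a minimal sketch of a cluster spec carrying that setting; the cluster name is hypothetical, and additional settings may be required depending on the cluster mode.

```python
# Hedged sketch: a cluster spec fragment enabling ADLS credential passthrough.
# Only the spark_conf key matters here; the cluster name is a placeholder.
cluster_spec = {
    "cluster_name": "hackolade-reverse-engineering",
    "spark_conf": {
        "spark.databricks.passthrough.enabled": "true",
    },
}
```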