
    Connect to a Delta Lake instance

    In the premium version of Databricks, table access control may be enabled.  If that is the case, the user must be granted the following data object privileges:

    1. for reverse-engineering: READ_METADATA and SELECT on the tables and views
    2. for forward-engineering (apply to instance): the rights to create or update schemas, tables, and views, i.e. CREATE on the catalog, plus either OWN or both USAGE and CREATE on the schemas, tables, and views
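As a sketch, the reverse-engineering privileges above could be granted with Databricks SQL GRANT statements. The user name and schema name below are hypothetical placeholders; granting at the schema level is assumed to cover the tables and views it contains.

```python
# Sketch: compose the GRANT statements for reverse-engineering access.
# "hackolade_user@example.com" and "my_schema" are hypothetical placeholders.
user = "hackolade_user@example.com"
schema = "my_schema"

grants = [
    f"GRANT USAGE ON SCHEMA {schema} TO `{user}`;",
    f"GRANT READ_METADATA ON SCHEMA {schema} TO `{user}`;",
    f"GRANT SELECT ON SCHEMA {schema} TO `{user}`;",
]
for stmt in grants:
    print(stmt)
```

These statements would be run by an administrator in a Databricks notebook or SQL session.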


    Connecting to a Databricks instance on one of the cloud providers requires declaring the connection settings and a personal access token.


    Delta Lake Databricks connection settings



    You should consult this page for more information on finding Databricks workspace details.


    Note: the connection must be established with compute clusters, not with a SQL Warehouse, which uses a different API that is not supported.


    To get the cluster ID, click the Clusters tab in the sidebar and then select a cluster name. The cluster ID is the identifier after the /clusters/ component in the URL of that page:



    In the following screenshot, the cluster ID is 1115-164516-often242:
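The extraction described above can be sketched in a few lines: given a cluster page URL, the cluster ID is the path segment immediately following /clusters/. The workspace hostname below is a hypothetical placeholder; the cluster ID is the one from the screenshot.

```python
# Hypothetical cluster page URL; the workspace hostname is a placeholder.
url = ("https://adb-1234567890123456.7.azuredatabricks.net"
       "/#setting/clusters/1115-164516-often242/configuration")

# The cluster ID is the segment immediately after "/clusters/".
cluster_id = url.split("/clusters/")[1].split("/")[0]
print(cluster_id)  # 1115-164516-often242
```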

    Delta Lake Databricks connection settings input



    Hackolade uses the Databricks REST API and requires an access token, issued in the Databricks console under User Settings > Generate New Token.  For more information, please consult this page.  Paste the access token in the field below.  Access token management is described here.
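For reference, the Databricks REST API authenticates requests by passing the personal access token as a Bearer token in the Authorization header. The sketch below only builds such a request without sending it; the workspace URL and token are hypothetical placeholders.

```python
import urllib.request

# Hypothetical values; replace with your own workspace URL, token, and cluster ID.
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"
access_token = "dapiXXXXXXXXXXXXXXXX"  # personal access token from User Settings
cluster_id = "1115-164516-often242"

# Databricks REST API calls carry the token as "Authorization: Bearer <token>".
req = urllib.request.Request(
    f"{workspace_url}/api/2.0/clusters/get?cluster_id={cluster_id}",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(req.get_header("Authorization"))  # Bearer dapiXXXXXXXXXXXXXXXX
```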


    Delta Lake Databricks connections settings auth


    For Databricks on Azure, you may have tables stored in Azure Data Lake Storage, which requires additional authentication for Databricks to access them.  In that case, you may need to declare ADLS credential passthrough:

    Databricks ADLS passthrough credentials


    You may want to read more about ADLS credential passthrough.
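As a point of reference, ADLS credential passthrough is typically enabled on the cluster itself via a Spark configuration property. The fragment below is a sketch of the cluster's Spark config, assuming a cluster type that supports passthrough; consult the Azure Databricks documentation for the exact requirements of your cluster mode.

```
spark.databricks.passthrough.enabled true
```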