This repository contains the source code for the PowerShell module "DatabricksPS". The module can also be found in the public PowerShell gallery: https://www.powershellgallery.com/packages/DatabricksPS/
It works for Databricks on both Azure and AWS. The two APIs are almost identical, so I decided to bundle them in a single module. The official API documentation can be found here:
Azure Databricks - https://docs.azuredatabricks.net/api/latest/index.html
Databricks on AWS - https://docs.databricks.com/api/latest/index.html
- Add support for Workspace configs (get/set)
- Add support for Global Init Scripts
- Add -Entitlements parameter to Add-DatabricksSCIMGroup
- Some fixes for proper pipelining when working with Groups and SCIM APIs
- Add test-case for Security (SCIM, Groups, memberships, ...)
- Fixed issue with Import of already existing files and folders
- Add support for Azure backed Secret Scopes for non-standard Azure environments like AzureChinaCloud or AzureUSGovernment
- Add support for AAD authentication in non-standard Azure environments like AzureChinaCloud or AzureUSGovernment
- Fix Secrets API when creating Azure KeyVault Backed Secret Scopes.
- Minor fix for Secrets API making -InitialManagePrincipal optional.
- Changed -ApiRootUrl parameter to support any URL and not just a fixed list.
- Added Get-DatabricksApiRootUrl cmdlet to retrieve the list of predefined API Root URLs
- Added new cmdlet Add-DatabricksClusterLocalLibrary to add a local library (.jar, .whl, ...) to a cluster with a single command
- Added Azure Active Directory (AAD) Authentication for Service Principals and Users
The easiest way to install the PowerShell module is to use the PowerShell built-in Install-Module cmdlet:
Install-Module -Name DatabricksPS
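If you do not have administrative rights, or want to keep the module up to date later on, the standard PowerShellGet cmdlets cover both cases; a quick sketch:

# install for the current user only (no admin rights required)
Install-Module -Name DatabricksPS -Scope CurrentUser
# update an existing installation to the latest version from the gallery
Update-Module -Name DatabricksPS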
Alternatively, you can download this repository, copy the folder \Modules\DatabricksPS to a local path, and load it from there using the Import-Module cmdlet:
Import-Module "C:\MyPSModules\Modules\DatabricksPS"
The module is designed so that you set the connection-relevant properties once and all subsequent cmdlets reuse them. You can update this information at any time to connect to a different Databricks environment within the same PowerShell session.
$accessToken = "dapi123456789e672c4007052d4694a7c51"
$apiUrl = "https://westeurope.azuredatabricks.net"
Set-DatabricksEnvironment -AccessToken $accessToken -ApiRootUrl $apiUrl
Once the environment is set up, you can use the other cmdlets:
Get-DatabricksWorkspaceItem -Path "/"
Export-DatabricksWorkspaceItem -Path "/TestNotebook1" -LocalPath "C:\TestNotebook1_Export.ipynb" -Format JUPYTER
Start-DatabricksJob -JobID 123 -NotebookParams @{myParameter = "test"}
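As described above, you can re-point the session to a different Databricks environment at any time by calling Set-DatabricksEnvironment again; a minimal sketch, where the AWS token and workspace URL are placeholders:

# switch the current session to an AWS-hosted workspace
$awsToken = "dapi000000000000000000000000000000"
Set-DatabricksEnvironment -AccessToken $awsToken -ApiRootUrl "https://dbc-12345678-abcd.cloud.databricks.com"
# all subsequent cmdlets now run against the AWS workspace
Get-DatabricksCluster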
Using pipelined cmdlets:
# stop all clusters
Get-DatabricksCluster | Stop-DatabricksCluster
# create multiple directories
"/test1","/test2" | Add-DatabricksWorkspaceDirectory
# get all run outputs for a given job
Get-DatabricksJobRun -JobID 123 | Get-DatabricksJobRunOutput
- Clusters API (Azure, AWS)
- Groups API (Azure, AWS)
- Jobs API (Azure, AWS)
- Secrets API (Azure, AWS)
- Token API (Azure, AWS)
- Workspace API (Azure, AWS)
- Libraries API (Azure, AWS)
- DBFS API (Azure, AWS)
- Instance Profiles API (AWS)
- SCIM API (Azure, AWS)
- Instance Pools API (Azure, AWS)
- Cluster Policies API (Azure, AWS)
There are three ways to authenticate against the Databricks REST API, two of which are unique to Azure:
- Personal Access token
- Azure Active Directory (AAD) Username/Password (Azure only!)
- Azure Active Directory (AAD) Service Principal (Azure only!)
This is the most straightforward authentication method and works for both Azure and AWS. The official documentation can be found here (Azure) or here (AWS) and is also persisted in this repository here.
$accessToken = "dapi123456789e672c4007052d4694a7c51"
$apiUrl = "https://westeurope.azuredatabricks.net"
Set-DatabricksEnvironment -AccessToken $accessToken -ApiRootUrl $apiUrl
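For automation scripts you may not want to keep the token in plain text; one option, using only built-in PowerShell cmdlets (not part of DatabricksPS), is to prompt for it once and extract it at runtime:

# prompt for the token instead of hard-coding it; the username is not used
$cred = Get-Credential -UserName "token" -Message "Enter your Databricks personal access token as the password"
$accessToken = $cred.GetNetworkCredential().Password
Set-DatabricksEnvironment -AccessToken $accessToken -ApiRootUrl $apiUrl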
This authentication method is very similar to the interactive login used when accessing the Databricks web UI. You provide the Databricks workspace you want to connect to, a username, and a password. The official documentation can be found here and is also persisted in this repository here.
$credUser = Get-Credential
$tenantId = '93519689-1234-1234-1234-e4b9f59d1963'
$subscriptionId = '30373b46-5678-5678-5678-d5560532fc32'
$resourceGroupName = 'myResourceGroup'
$workspaceName = 'myDatabricksWorkspace'
$azureResourceId = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Databricks/workspaces/$workspaceName"
$clientId = 'db00e35e-1111-2222-3333-c8cc85e6f524'
$apiUrl = "https://westeurope.azuredatabricks.net"
Set-DatabricksEnvironment -ClientID $clientId -Credential $credUser -AzureResourceID $azureResourceId -TenantID $tenantId -ApiRootUrl $apiUrl
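Once connected, any regular cmdlet serves as a quick smoke test of the AAD login, for example:

# list the workspace root to verify the session is authenticated
Get-DatabricksWorkspaceItem -Path "/"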
Service Principals are special accounts in Azure Active Directory which can be used for automated tasks like CI/CD pipelines. You provide the Databricks workspace you want to connect to, the ClientID and a ClientSecret/ClientKey. ClientID and ClientSecret need to be wrapped into a PSCredential where the ClientID is the username and the ClientSecret/ClientKey is the password. The rest is very similar to the username/password authentication, except that you also need to specify the -ServicePrincipal flag. The official documentation can be found here and is also persisted in this repository here.
$credSP = Get-Credential
$tenantId = '93519689-1234-1234-1234-e4b9f59d1963'
$subscriptionId = '30373b46-5678-5678-5678-d5560532fc32'
$resourceGroupName = 'myResourceGroup'
$workspaceName = 'myDatabricksWorkspace'
$azureResourceId = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Databricks/workspaces/$workspaceName"
$clientId = 'db00e35e-1111-2222-3333-c8cc85e6f524'
$apiUrl = "https://westeurope.azuredatabricks.net"
Set-DatabricksEnvironment -ClientID $clientId -Credential $credSP -AzureResourceID $azureResourceId -TenantID $tenantId -ApiRootUrl $apiUrl -ServicePrincipal
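In unattended scenarios like CI/CD pipelines, the PSCredential is typically built programmatically instead of via Get-Credential; a sketch, assuming the ClientSecret is provided through an environment variable (the variable name is just an example):

# wrap ClientID (username) and ClientSecret (password) into a PSCredential without prompting
$clientSecret = ConvertTo-SecureString $env:DATABRICKS_CLIENT_SECRET -AsPlainText -Force
$credSP = New-Object System.Management.Automation.PSCredential($clientId, $clientSecret)
Set-DatabricksEnvironment -ClientID $clientId -Credential $credSP -AzureResourceID $azureResourceId -TenantID $tenantId -ApiRootUrl $apiUrl -ServicePrincipal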