Description
The databricks-sdk fails to authenticate when running inside a local Jupyter Notebook (.ipynb) in VS Code, even though the exact same code and configuration profile work perfectly when executed as a standard Python script (.py). The Notebook environment seems unable to pick up the default credentials or session-based environment variables.
Reproduction
File: main.py (Works)

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
print([p.name for p in w.pipelines.list_pipelines()])
```
File: notebook.ipynb (Fails)

```python
# Cell 1
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # Raises ValueError: cannot configure default credentials
```
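A quick way to confirm whether the kernel inherits the terminal's session is to print the Databricks-related environment variables from inside a notebook cell and compare against the .py run. This diagnostic is a sketch using only the standard library; it assumes the relevant variables all use the `DATABRICKS_` prefix, which is the convention the SDK's default credential chain reads:

```python
import os


def databricks_env() -> dict:
    """Return all DATABRICKS_* environment variables visible to this process."""
    return {k: v for k, v in os.environ.items() if k.startswith("DATABRICKS_")}


# Run this in both the .py script and a notebook cell. An empty dict in the
# notebook but a populated one in the script would confirm the kernel did
# not inherit the terminal's environment.
print(databricks_env())
```

In my case this is exactly what the symptom suggests: the terminal-launched script sees the variables, the kernel does not.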
Expected behavior
The WorkspaceClient() should resolve credentials from the ~/.databrickscfg file and active environment variables consistently across both .py scripts and .ipynb kernels.
Is it a regression?
Unknown. Only tested on Python 3.13.
Debug Logs
ValueError: default auth: cannot configure default credentials, please check https://docs.databricks.com/en/dev-tools/auth.html#databricks-client-unified-authentication
Config: host=https://####, account_id=&&&, workspace_id=***, discovery_url=https:####/oidc/.well-known/oauth-authorization-server, auth_type=databricks-cli
Other Information
- OS: Windows 11
- Python Version: 3.13.0
- SDK Version: 0.102.0
Additional context
The .databrickscfg file is properly configured with a [DEFAULT] profile using auth_type=databricks-cli.
The issue persists in the Notebook even after running databricks auth login via the integrated terminal and restarting the Jupyter kernel.
Environment variables such as DATABRICKS_CONFIG_PROFILE set in the VS Code terminal are visible to the .py script but appear not to be inherited by the Jupyter kernel.
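Since the env-var route appears broken in the kernel, a second check is whether the kernel can read the config file at all, independent of environment inheritance. The sketch below uses only the standard library; the commented-out workaround assumes `WorkspaceClient(profile=...)` pins a profile explicitly (it does in the SDK versions I have seen), though whether that bypasses this particular failure is exactly what this issue is about:

```python
import configparser
from pathlib import Path


def read_profile(path: Path, profile: str = "DEFAULT") -> dict:
    """Read one profile from a databrickscfg-style file; {} if absent."""
    cfg = configparser.ConfigParser()
    cfg.read(path)  # silently skips a nonexistent file
    if profile == "DEFAULT":
        return dict(cfg.defaults())
    return dict(cfg[profile]) if cfg.has_section(profile) else {}


# In a notebook cell: confirm the kernel can actually see the file.
print(read_profile(Path.home() / ".databrickscfg"))

# If the file is readable, pinning the profile explicitly may sidestep
# env-var resolution (hypothetical workaround, not verified for this bug):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient(profile="DEFAULT")
```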