For data-heavy notebooks with long-running cells and slow top-to-bottom runtimes, it's really useful in the Databricks notebooks web UI to be able to close the browser tab, the browser, or the computer, and come back later to all of the existing kernel state just as you left it.
It'd be really useful to enable this in Databricks Power Tools notebooks too. Something like:
- Store the active (clusterId, contextId) per notebook editor (from /contexts/create)
- On VS Code restart/reload, notebook editor reopen, etc., load the stored (clusterId, contextId) and reuse it if that context and cluster are still alive (from /contexts/status)
- Rely on the Databricks cluster to automatically terminate idle execution contexts (enabled by default); see the sketch below
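A minimal sketch of what this could look like, assuming a TypeScript extension host with global `fetch` (Node 18+), token-based auth, and the 1.2 execution-context endpoints mentioned above. The helper names, state key, and persisted shape are illustrative, not the extension's actual internals:

```typescript
import * as vscode from 'vscode';

// Hypothetical persisted shape; the real extension may track more per-editor state.
interface PersistedContext {
  clusterId: string;
  contextId: string;
}

const STATE_KEY = 'databricksPowerTools.executionContexts'; // assumed key name

// Look up a previously stored (clusterId, contextId) for a notebook and verify it is
// still alive via /contexts/status before reusing it; otherwise create a fresh context.
export async function resolveExecutionContext(
  ext: vscode.ExtensionContext,
  notebookUri: vscode.Uri,
  host: string,          // e.g. https://<workspace>.cloud.databricks.com
  token: string,         // personal access token (assumed auth mechanism)
  clusterId: string,
  language: string = 'python'
): Promise<PersistedContext> {
  const all = ext.workspaceState.get<Record<string, PersistedContext>>(STATE_KEY, {});
  const stored = all[notebookUri.toString()];

  // Reuse the stored context only if it targets the same cluster and is still running.
  if (stored && stored.clusterId === clusterId && (await isContextAlive(host, token, stored))) {
    return stored;
  }

  // Fall back to creating a new execution context (POST /api/1.2/contexts/create).
  const res = await fetch(`${host}/api/1.2/contexts/create`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ clusterId, language }),
  });
  const { id } = (await res.json()) as { id: string };

  const fresh: PersistedContext = { clusterId, contextId: id };
  all[notebookUri.toString()] = fresh;
  await ext.workspaceState.update(STATE_KEY, all);
  return fresh;
}

// GET /api/1.2/contexts/status reports the context's status ("Pending", "Running",
// "Error") or fails if the context or cluster no longer exists.
async function isContextAlive(host: string, token: string, ctx: PersistedContext): Promise<boolean> {
  try {
    const res = await fetch(
      `${host}/api/1.2/contexts/status?clusterId=${ctx.clusterId}&contextId=${ctx.contextId}`,
      { headers: { Authorization: `Bearer ${token}` } }
    );
    if (!res.ok) { return false; }
    const { status } = (await res.json()) as { status: string };
    return status === 'Running';
  } catch {
    return false;
  }
}
```

Since `workspaceState` persists the mapping across window reloads, a reopened notebook editor could call `resolveExecutionContext` again and transparently pick up its old context with all kernel state intact, as long as the cluster hasn't auto-terminated it in the meantime.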