  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · First, install the Databricks Python SDK (pip install databricks-sdk) and configure authentication per the docs here. Then you can use the approach below to print out secret …
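    A minimal sketch of that approach, assuming the SDK is installed and authenticated (environment variables or a .databrickscfg profile) and using placeholder scope/key names; the secret value comes back base64-encoded, so it is decoded before printing:

        import base64
        from databricks.sdk import WorkspaceClient

        w = WorkspaceClient()  # picks up authentication from env vars or .databrickscfg
        # scope and key names below are placeholders
        resp = w.secrets.get_secret(scope="my-scope", key="my-key")
        print(base64.b64decode(resp.value).decode("utf-8"))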

  2. Databricks: How do I get path of current notebook?

    Nov 29, 2018 · The issue is that Databricks does not have integration with VSTS. A workaround is to download the notebook locally using the CLI and then use git locally. I would, however, …
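    Separately, the path question in the title is often answered with a call into the notebook context via dbutils; a minimal sketch, only runnable inside a Databricks notebook where dbutils is predefined:

        # read the current notebook's workspace path from the notebook context
        path = (
            dbutils.notebook.entry_point.getDbutils()
            .notebook().getContext()
            .notebookPath().get()
        )
        print(path)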

  3. How to deploy a Streamlit app using Databricks App

    Oct 12, 2024 · Here is how you create an app in Databricks. Click the New (+) button and select App. Next, for the built-in template, select Streamlit and the type of app. You can also select …
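    Once the app resource exists, it serves a Streamlit entry point. A minimal placeholder app.py (illustrative only, not the template's actual code) might look like:

        import streamlit as st

        st.title("Hello from Databricks Apps")
        name = st.text_input("Your name", "world")
        st.write(f"Hello, {name}!")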

  4. Databricks - Download a dbfs:/FileStore file to my Local Machine

    In a Spark cluster you access DBFS objects using Databricks file system utilities, Spark APIs, or local file APIs. On a local computer you access DBFS objects using the Databricks CLI or …
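    As one local-machine option, a minimal sketch using the Python SDK's DBFS read endpoint, assuming the SDK is installed and authenticated; the DBFS path and output file name are placeholders, and contents arrive base64-encoded in chunks:

        import base64
        from databricks.sdk import WorkspaceClient

        w = WorkspaceClient()
        src = "dbfs:/FileStore/example.csv"  # placeholder DBFS path
        with open("example.csv", "wb") as out:
            offset = 0
            while True:
                chunk = w.dbfs.read(src, offset=offset, length=1024 * 1024)
                if not chunk.bytes_read:
                    break
                out.write(base64.b64decode(chunk.data))
                offset += chunk.bytes_read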

  5. Clearing Cached Data on Databricks Cluster - Stack Overflow

    Jan 17, 2025 · spark.conf.set("spark.databricks.io.cache.enabled", "false"). I understand this only disables the IO cache, and while it does lead to some reduction in the cached memory, a …
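    In addition to the IO-cache flag, Spark's own memory cache can be dropped explicitly; a minimal sketch, assuming the spark session provided in Databricks notebooks:

        # disable the Databricks disk (IO) cache for new reads
        spark.conf.set("spark.databricks.io.cache.enabled", "false")
        # drop all DataFrames/tables cached in memory via cache()/persist()
        spark.catalog.clearCache()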

  6. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Update: April 2023. There is a new SQL Execution API for querying Databricks SQL tables via REST API. It's possible to use Databricks for that, although it heavily …
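    A minimal sketch of calling that Statement Execution API with the requests library; the host, token, warehouse ID, and table name are all placeholders:

        import requests

        host = "https://<workspace-host>"
        resp = requests.post(
            f"{host}/api/2.0/sql/statements",
            headers={"Authorization": "Bearer <personal-access-token>"},
            json={
                "warehouse_id": "<warehouse-id>",
                "statement": "SELECT * FROM my_catalog.my_schema.my_table LIMIT 10",
                "wait_timeout": "30s",
            },
        )
        print(resp.json().get("result", {}).get("data_array"))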

  7. Is there a way to use parameters in Databricks in SQL with …

    Sep 29, 2024 · I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. …
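    For comparison, a minimal sketch of a case where named parameter markers do work, issuing the query from Python via spark.sql on a recent runtime; the table and column names are placeholders, and the truncated question above concerns a scenario where this syntax is not available:

        df = spark.sql(
            "SELECT * FROM my_table WHERE amount > :min_amount",
            args={"min_amount": 100},
        )
        df.show()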

  8. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage …
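    A minimal sketch of the difference in DDL, with placeholder names and a placeholder cloud path; only the second statement pins the data to an external LOCATION:

        # managed table: Databricks owns both the metadata and the data files
        spark.sql("CREATE TABLE demo_managed (id INT, name STRING)")

        # external table: metadata is registered, data stays at the given path
        spark.sql("""
            CREATE TABLE demo_external (id INT, name STRING)
            LOCATION 'abfss://container@account.dfs.core.windows.net/demo_external'
        """)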

  9. databricks - This request is not authorized to perform this …

    and it solved my problem. Now I have access from Databricks to the mounted containers. Here is how to give permissions to the service-principal app: Open the storage account; open IAM; click …
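    For context, a minimal sketch of the ADLS Gen2 mount that starts working once the service principal has an appropriate data-plane role on the storage account (commonly Storage Blob Data Contributor); every ID, name, and secret scope below is a placeholder:

        configs = {
            "fs.azure.account.auth.type": "OAuth",
            "fs.azure.account.oauth.provider.type":
                "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
            "fs.azure.account.oauth2.client.id": "<application-id>",
            "fs.azure.account.oauth2.client.secret":
                dbutils.secrets.get(scope="my-scope", key="sp-secret"),
            "fs.azure.account.oauth2.client.endpoint":
                "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
        }
        dbutils.fs.mount(
            source="abfss://<container>@<account>.dfs.core.windows.net/",
            mount_point="/mnt/data",
            extra_configs=configs,
        )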

  10. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · Easiest is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying appropriate params for your job cluster). You can also use the REST …
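    A minimal sketch of the same idea through the Python SDK's libraries API; the cluster ID and package names are placeholders, and libraries attached this way are reinstalled whenever the cluster restarts:

        from databricks.sdk import WorkspaceClient
        from databricks.sdk.service import compute

        w = WorkspaceClient()
        w.libraries.install(
            cluster_id="0123-456789-abcdefgh",
            libraries=[
                compute.Library(pypi=compute.PythonPyPiLibrary(package="pandas==2.2.2")),
                compute.Library(pypi=compute.PythonPyPiLibrary(package="requests")),
            ],
        )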
