- References:
- https://docs.microsoft.com/en-us/azure/databricks/data/databricks-file-system
- https://www.mssqltips.com/sqlservertip/6931/mount-azure-data-lake-storage-gen2-account-databricks/
- https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-storage
- Example to mount (the `<conf-key>` is either `fs.azure.account.key.<storage-account-name>.blob.core.windows.net` for an account access key, or `fs.azure.sas.<container-name>.<storage-account-name>.blob.core.windows.net` for a SAS token):
dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})
- Example to access a container directly with a SAS token (no mount needed):
spark.conf.set(
  "fs.azure.sas.<container-name>.<storage-account-name>.blob.core.windows.net",
  "<complete-query-string-of-sas-for-the-container>")
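Once a container is mounted, files under `/mnt/<mount-name>` behave like ordinary DBFS paths. A minimal sketch of reading a CSV from the mount, assuming the mount above succeeded; `spark` is the SparkSession the Databricks notebook runtime predefines, and the mount/file names are placeholders:

```python
def dbfs_path(mount_name: str, relative_path: str) -> str:
    """Build a DBFS path under a mount point, e.g. /mnt/data/raw/sales.csv."""
    return f"/mnt/{mount_name}/{relative_path}"

def read_mounted_csv(mount_name: str, relative_path: str):
    """Read a CSV from the mounted container into a Spark DataFrame.
    Note: `spark` is only defined inside a Databricks notebook or job."""
    return (spark.read.format("csv")
                 .option("header", "true")
                 .load(dbfs_path(mount_name, relative_path)))

# Inside a notebook: df = read_mounted_csv("<mount-name>", "raw/sales.csv")
```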
- Example to unmount:
dbutils.fs.unmount("/mnt/<mount-name>")
- Clear Python driver memory (gc.collect() reclaims unreachable objects, including reference cycles that reference counting alone cannot free):
import gc
gc.collect()
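As a concrete illustration of what gc.collect() reclaims: two objects that reference each other form a cycle, which plain reference counting cannot free, so an explicit collection pass picks them up. This runs in any Python interpreter, not just Databricks:

```python
import gc

class Node:
    """Simple object used to build a reference cycle."""
    def __init__(self):
        self.other = None

a, b = Node(), Node()
a.other, b.other = b, a   # a -> b -> a: a reference cycle
del a, b                  # now unreachable, but ref counts are still non-zero

collected = gc.collect()  # force a full pass; returns number of objects found
```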
- Clear the cluster cache (unpersists all cached tables and DataFrames):
sqlContext.clearCache()  # legacy entry point; spark.catalog.clearCache() is the equivalent on newer runtimes