Serverless compute for workflows lets you run your Databricks jobs without configuring or deploying infrastructure, so you can focus solely on implementing your data processing and analysis pipelines. Databricks manages the compute resources, including optimizing and scaling them for your workloads. With autoscaling and Photon enabled automatically, you get efficient resource utilization out of the box.
Additionally, serverless compute for workflows features auto-optimization, which selects appropriate resources such as instance types, memory, and processing engines based on your workload. It also automatically retries failed jobs, keeping your data workflows running smoothly.
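To make the "no infrastructure configuration" point concrete, here is a minimal sketch of a Jobs API-style job definition. The field names follow the Databricks Jobs API JSON shape, but the job name and notebook path are illustrative assumptions; the key idea is that no cluster specification appears anywhere, which is what selects serverless compute:

```python
import json

# Illustrative job settings (name and notebook path are made up).
# With serverless compute, no cluster configuration is supplied:
# there is no "new_cluster" or "existing_cluster_id" field, so
# Databricks provisions, optimizes, and scales compute for you.
job_settings = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "transform",
            "notebook_task": {"notebook_path": "/Workspace/etl/transform"},
            # No cluster fields here -> the task runs on serverless compute.
        }
    ],
}

payload = json.dumps(job_settings, indent=2)
print(payload)
```

Compare this with a classic-compute job, where each task (or the job as a whole) must carry a cluster definition with instance types, node counts, and a Spark version; on serverless those decisions are delegated to the platform.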
On July 15, 2024, [Serverless Compute for Notebooks, Workflows, and Delta Live Tables went into GA](https://www.databricks.com/blog/announcing-general-availability-serverless-compute-notebo