After the warehouse: Orchestrating enterprise data pipelines across SAP Business Data Cloud
Just over a year ago, SAP introduced SAP Business Data Cloud (BDC) alongside its Databricks partnership, and later in the year extended it with a Snowflake partnership, positioning SAP BDC as the next evolution of enterprise data management on SAP Business Technology Platform (BTP). Neither the announcement nor the ecosystem behind it was an incremental update. Together, they signaled a strategic shift in how SAP customers are expected to manage data, analytics and AI going forward.
This shift comes at a decisive moment: SAP Business Warehouse (BW) reaches the end of mainstream maintenance in 2027, with extended maintenance ending in 2030. SAP BW/4HANA remains supported until at least 2040, but the long-term direction is clear. If you’re running SAP today, you’re likely moving from primarily on-premises, centralized data warehousing toward a cloud-based, multi-service data architecture.
That change is structural, and structural changes introduce new operational realities. As you modernize your data landscape as part of a broader SAP Cloud ERP or SAP Cloud ERP Private journey in GROW with SAP or RISE with SAP, the goal isn’t just architectural alignment. It’s to accelerate transformation while keeping operating costs predictable and avoiding new layers of technical debt.
What fundamentally changes with SAP Business Data Cloud
In a traditional SAP BW landscape, most data warehousing functions lived inside one system boundary. Data extraction, transformation, modeling, scheduling and reporting were tightly coupled. Even in complex SAP ERP environments, there was a central anchor point for enterprise data.
SAP BDC operates differently. Instead of one primary platform, you’re working across a set of tightly integrated services on SAP BTP. SAP Datasphere, SAP Analytics Cloud, SAP BW and BW/4HANA, Databricks and Snowflake together form a broader data fabric.
SAP Datasphere, evolving from SAP Data Warehouse Cloud and incorporating capabilities from SAP Data Intelligence Cloud, is positioned as the core enterprise data management platform. It integrates with SAP Analytics Cloud for analytics and planning, and with Databricks and Snowflake for data pipelines, advanced analytics and AI scenarios.
From a data perspective, integration is stronger than ever. Semantics, metadata and access across SAP systems are more aligned than in previous generations.
But integration isn’t orchestration. As your landscape expands across these services, you still need a way to coordinate how jobs, dependencies and business processes execute across them.
Where orchestration becomes operationally critical
In SAP BDC environments, each component has its own scheduler and automation capabilities.
- SAP Datasphere runs replication flows and transformations
- Databricks executes machine learning pipelines
- Snowflake processes large-scale analytics workloads
- SAP Analytics Cloud refreshes dashboards and publishes stories
- SAP BW and BW/4HANA continue to run process chains
Individually, these systems work. The challenge appears when those jobs are part of a larger end-to-end business process.
Take a straightforward example. You run an extract, transform and load (ETL) or replication flow in SAP Datasphere. Once the data is updated and validated, you need to publish a new SAP Analytics Cloud story based on that refreshed dataset. Both steps can be scheduled locally. What connects them? What ensures the SAP Analytics Cloud publication only happens after the upstream process has completed successfully?
The same pattern applies if you’re using Databricks or Snowflake instead of SAP Datasphere. A machine learning or analytics job runs overnight. When it finishes, downstream reporting or operational updates need to be triggered. Each platform can manage its own workload, but the dependency between them isn’t governed unless you introduce orchestration across systems.
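The dependency pattern described above can be sketched as a small polling loop. This is a minimal illustration, not RunMyJobs or SAP code: the `upstream_status` and `trigger_downstream` callables are hypothetical stand-ins for whatever API each platform exposes (for example, checking a replication flow in SAP Datasphere and then publishing an SAP Analytics Cloud story).

```python
import time

def run_with_dependency(upstream_status, trigger_downstream,
                        poll_seconds=30, timeout_seconds=3600, sleep=time.sleep):
    """Trigger the downstream step only after the upstream job succeeds.

    upstream_status: callable returning "RUNNING", "SUCCEEDED" or "FAILED"
    trigger_downstream: callable that starts the dependent job
    """
    waited = 0
    while waited <= timeout_seconds:
        status = upstream_status()
        if status == "SUCCEEDED":
            # Upstream data is refreshed and validated; publish downstream.
            return trigger_downstream()
        if status == "FAILED":
            raise RuntimeError("Upstream job failed; downstream step skipped")
        sleep(poll_seconds)
        waited += poll_seconds
    raise TimeoutError("Upstream job did not finish within the timeout")

# Demo with stubbed platform calls (a real pipeline would call the
# upstream and downstream services' own APIs instead):
statuses = iter(["RUNNING", "RUNNING", "SUCCEEDED"])
result = run_with_dependency(
    upstream_status=lambda: next(statuses),
    trigger_downstream=lambda: "story-published",
    sleep=lambda _: None,  # skip real waiting in this demo
)
print(result)  # story-published
```

The point of the sketch is that this glue logic lives outside both platforms; an orchestration layer exists so each such dependency doesn’t have to be hand-written and maintained as custom script.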
A second, equally common scenario is nightly batch processing across multiple services. You may schedule jobs independently inside SAP Datasphere, Databricks, Snowflake or SAP BW. Each executes reliably, but you don’t have a consolidated view of what’s happening across SAP BDC as a whole. There’s no single operational window into cross-platform execution, and understanding overall status may require reviewing several consoles.
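Conceptually, the missing single operational window is just a merge of per-platform job states into one view. The sketch below is illustrative only; the platform names, job names and states are hypothetical examples of what each scheduler might report.

```python
def consolidate(job_reports):
    """Merge per-platform job states into one cross-landscape view.

    job_reports: mapping of platform name -> {job name: state}
    Returns (rows, overall), where overall is "OK" only if no job failed.
    """
    rows = []
    overall = "OK"
    for platform, jobs in sorted(job_reports.items()):
        for job, state in sorted(jobs.items()):
            rows.append((platform, job, state))
            if state == "FAILED":
                overall = "FAILED"
    return rows, overall

# Hypothetical nightly snapshot gathered from three schedulers:
snapshot = {
    "SAP Datasphere": {"replication_flow": "SUCCEEDED"},
    "Databricks": {"ml_training": "SUCCEEDED"},
    "Snowflake": {"analytics_load": "FAILED"},
}
rows, overall = consolidate(snapshot)
for platform, job, state in rows:
    print(f"{platform:16} {job:18} {state}")
print("Overall:", overall)  # Overall: FAILED
```

Without a layer that performs this kind of consolidation, the only way to answer “did last night’s batch succeed?” is to check each console in turn.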
That’s where orchestration extends the value of SAP BDC — by coordinating native schedulers and providing transparency across the ecosystem. It also reduces operational overhead. Instead of managing multiple schedulers, agents and custom scripts across environments, you establish a unified control layer that scales with your architecture. That’s particularly important in RISE with SAP environments with SAP Cloud ERP Private, where clean core principles discourage custom code inside the ERP and where unnecessary infrastructure adds cost and complexity.
The role of RunMyJobs in the SAP BDC era
RunMyJobs by Redwood provides that orchestration layer. It’s the only workload automation platform that’s both an SAP Endorsed App and included in the RISE with SAP reference architecture. RunMyJobs’ secure gateway connection to a customer’s RISE with SAP environment can be installed, hosted and managed by the SAP Enterprise Cloud Services team, eliminating the need for additional infrastructure and supporting clean core strategies from day one. Recognized as a Leader in the Gartner® Magic Quadrant™ for Service Orchestration and Automation Platforms, RunMyJobs centralizes scheduling, dependency management and monitoring across SAP and non-SAP systems.
For SAP BDC environments, RunMyJobs offers out-of-the-box connectors for:
- SAP Datasphere
- SAP Analytics Cloud
- SAP BW and BW/4HANA
- Databricks
- Snowflake
Because RunMyJobs uses a secure gateway connection, very similar to how SAP Cloud Connector works, rather than requiring agents to be deployed across every SAP system, you avoid the operational costs and upgrade friction associated with agent-heavy architectures. That reduces maintenance effort, lowers total cost of ownership (TCO) and minimizes risk during SAP upgrades or RISE with SAP transformations.
In practice, you can:
- Trigger downstream analytics only after upstream data validation completes
- Coordinate nightly batch processes across multiple cloud services
- Establish a single pane of glass for visibility into SAP BDC execution
You don’t have to stop scheduling locally if that works for your teams, but by introducing an orchestration layer, you gain consistent control across the full landscape.
Supporting your path forward
There isn’t one correct response to the end of SAP BW mainstream maintenance. You may accelerate toward SAP Datasphere and a cloud-centric architecture. You may move selectively while continuing to run SAP BW/4HANA well into the next decade. Or, you may operate a hybrid model for years.
RunMyJobs supports all of the above, offering orchestration for classic SAP BW environments and all major components of SAP BDC. Whether you’re stabilizing existing SAP BW process chains or orchestrating new cloud-based workflows, the objective is the same: maintain control over execution across your environment.
You don’t have to complete a migration to benefit from orchestration. And you don’t have to abandon SAP BW to modernize your control layer. In fact, many organizations introduce orchestration early in their RISE with SAP and SAP Cloud ERP transformation to de-risk migration, retire legacy schedulers and create a scalable SaaS control tower before complexity compounds. That approach helps reduce disruption during go-live while positioning your automation strategy for long-term innovation.

A foundation for AI and advanced analytics
SAP BDC is also positioned as the foundation for enterprise AI and advanced analytics initiatives. Clean, harmonized data enables machine learning models and advanced analytics use cases.
But AI pipelines introduce additional operational dependencies. Training jobs, scoring runs, data refresh cycles and reporting updates must align across systems. As those chains grow, so does the need for consistent governance and monitoring. With RunMyJobs, the leading orchestration platform for the autonomous enterprise, you can apply consistent governance, monitoring and error handling across both traditional data warehousing processes and new, AI-driven workflows. That consistency is what turns experimentation into enterprise-grade transformation, without introducing new layers of manual oversight or operational costs.
See how RunMyJobs provides a coordination layer across SAP BTP, SAP BDC and your broader landscape.
Architect for control
As your SAP data landscape becomes more distributed across SAP BTP services, execution coordination becomes more important. Data integration continues to improve across SAP’s ecosystem. The next question is how you want those integrated systems to run together.
If you’re evaluating how to orchestrate SAP Datasphere, SAP Analytics Cloud, SAP BW, Databricks or Snowflake, particularly as part of a RISE with SAP and SAP Cloud ERP journey, the goal isn’t just coordination. It’s to modernize your execution layer in a way that supports clean core principles, reduces TCO and accelerates transformation across your enterprise.
The next step is practical: understand how orchestration connects to each of these platforms in your landscape.
Explore the full set of RunMyJobs SAP connectors and see how they extend SAP BTP and SAP BDC with enterprise-grade orchestration.
About the author
Sven Kohlhaas
Sven Kohlhaas is Vice President – SAP Product Lead at Redwood Software. He is responsible for the global success and evolution of Redwood’s SAP-related product portfolio, helping organizations orchestrate complex business processes across their SAP and non-SAP systems. His vision is to drive operational excellence by empowering enterprises to maximize the value of their technology investments.
With almost 20 years of experience in the IT industry, most of which at SAP in high-impact product and engineering roles, Sven is a seasoned leader with unique subject-matter expertise. His background spans enterprise software, service orchestration and automation, SaaS and PaaS cloud platforms, GenAI and ERP systems. This deep technical foundation allows him to bridge the gap between legacy environments and next-generation cloud architectures.