With SAP’s annual TechEd in sight and the recent slew of updates on SAP Business Data Cloud (BDC), this is as good a moment as any to take stock of where the solution stands today. New and existing SAP (Datasphere) customers who are considering BDC continue to have questions about what is possible with SAP’s next-generation data and analytics suite, so let me try to answer some of these in this blog post.
The latest: BDC Connect to Google BigQuery and Databricks
On October 6th, SAP and Google announced the SAP Business Data Cloud Connect for Google BigQuery. This is a big deal, given that the zero-copy mechanism implemented here was first announced in conjunction with BDC itself, but at the time exclusively with Databricks. BDC Connect for Google (which makes Google Cloud the first hyperscaler to support BDC Connect) is bi-directional, meaning that customers can move semantically rich data between their SAP ecosystem and BigQuery. SAP and Google are quick to mention the use cases for AI and autonomous agents here, but I am a little skeptical about the efficiency of zero-copy (or virtualization/federation) when it comes to the data volumes these applications require. Nevertheless, this is good news for customers who (also) have Google solutions as part of their data landscape.
On the same day that the SAP/Google announcement was made, Databricks also presented the general availability of the SAP Business Data Cloud Connect for Databricks. Offering the same bi-directional, zero-copy sharing of data (products) (specifically via Delta Sharing in the case of Databricks), BDC Connect for Databricks is especially useful for customers who are considering BDC (or Datasphere) but already have a separate Databricks instance in their landscape. For these customers, BDC Connect offers a relatively simple and secure (mTLS and OAuth) method to get SAP business data to Databricks without losing SAP’s business semantics. One question I still have is whether BDC Connect works across hyperscalers, or whether it is restricted to a single cloud, similar to the SAP BDC connector for Databricks (currently only available for AWS and GCP), either in terms of availability or in terms of additional network costs.
With regard to BDC Connect for Databricks, my colleague Dick Groenhof also raised a concern about the (probable) lack of storage optimization of data on HDLFS (Apache Parquet files) when using Delta Sharing. In stand-alone Databricks, this can be optimized (e.g. with liquid clustering), but it is unclear whether this will work with data stored in the BDC Object Store, as it is read-only. This in turn could cause performance issues with large data volumes that should (ideally) be partitioned. We will report on our future findings at Expertum and check back in when there is more clarity on these topics.
Integrating with relevant solutions
Arguably the most interesting integration option for BDC is of course SAP Databricks. SAP Databricks has to be onboarded into the Business Data Cloud formation inside the customer’s tenant (via the SAP for Me account) before it can be used. After this has been done, Data Products can be shared to SAP Databricks from BDC’s Foundation Services. Custom Data Products can be built on local data (which currently has to come from the HDLFS Space in Datasphere) and can then be shared with SAP Databricks via Delta Sharing. Finally, customers can use either SQL or Python notebooks to work with the data in SAP Databricks.
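To make the last step above concrete, here is a minimal sketch of how a Data Product shared via Delta Sharing could be addressed from a Python notebook. All names (the profile file, share, schema, and table) are hypothetical placeholders, not actual BDC identifiers; the helper simply builds the `<profile>#<share>.<schema>.<table>` URL format that the open-source `delta-sharing` client expects.

```python
def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the table URL format used by the delta-sharing Python client:
    '<profile-file>#<share>.<schema>.<table>'."""
    return f"{profile_path}#{share}.{schema}.{table}"

# Hypothetical names for illustration only; real values come from the
# share provisioned through BDC's Foundation Services.
url = table_url("bdc_profile.json", "sap_data_products", "finance", "gl_line_items")
print(url)

# With the delta-sharing package installed and a valid profile file,
# the shared Data Product could then be loaded into a DataFrame:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)
```

In a SQL notebook, the same shared table would instead be queried directly once the share is mounted in the workspace catalog.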
When it comes to S/4HANA Private Cloud, customers should note that the SAP Cloud Connector will be required to integrate it with BDC. Moreover, there are limitations to the ABAP integrations between the two systems. In BDC’s current release (early Q4 2025), S/4HANA Private Cloud is the only LoB solution that is supported within a BDC Formation; S/4HANA Public Cloud and SuccessFactors will follow later this year (regarding S/4HANA Public Cloud, SAP stated it hit some technical hiccups, but this does not impact their advice to go for BDC as opposed to a standalone Datasphere solution). More information can be found here: https://me.sap.com/notes/3500131.
Another question that more technically ambitious customers might have concerns the integration of the HANA Knowledge Graph into BDC. SAP has stated that this is on the roadmap for 2026 and will allow BDC data to be analyzed as part of graph datasets or in graph analytics.
Finally, customers who do not yet have S/4HANA and are still on SAP ECC but are contemplating a connection to BDC/Datasphere should know that this can be realized through replication flows. Note that if data needs to be loaded into Datasphere’s Object Store (in HDLFS format), customers should use the ABAP adapter rather than the classic ECC adapter. However, my advice would always be to consider when (not if) you will migrate your legacy ERP system to S/4HANA, in order to safeguard a healthy future data landscape.
Data Products & Intelligent Applications
These two object types form the foundation of BDC as a solution, so it is unsurprising that there are many questions on these topics as well.
First of all, it should be noted that Intelligent Applications are licensed separately from the main BDC license; customers do not receive them as a form of business content. The Intelligent Applications released so far only contain data coming from either S/4HANA or SuccessFactors. Customers can create their own Intelligent Applications similar to how Custom Data Products are built: in SAP Analytics Cloud (SAC) and Datasphere, respectively. This means that for Intelligent Applications, the Data Products you want to use will have to be installed in SAP Datasphere first.
To create Custom Data Products, SAP will soon introduce the Data Product Studio, which will (very importantly) also be able to use non-SAP data (as was shared in the recent SAP BDC Live Workshops). This separate application, which will become part of the BDC Cockpit, will allow customers to also share Custom Data Products with SAP Databricks (through Delta Sharing), whereas current Custom Data Products can only be leveraged in SAC. It will be interesting to see if stand-alone Enterprise Databricks and Google BigQuery, both mentioned above, will also be able to fully leverage Custom Data Products created via the Data Product Studio.
Now, what about custom fields in the source object, for example in S/4HANA CDS views? SAP has stated that by default, Data Products will not include any specific ‘Z-fields’. However, if a customer onboards a (Custom) Data Product through BDC’s Foundation Services, the metadata for the source views is analyzed and subsequently harmonized with the Data Product in SAP Datasphere, enabling it to retrieve the contents of custom fields from that moment on. So, long story short: if your custom fields are part of the source object’s metadata definition at the time of the respective Data Product’s onboarding, you will be able to leverage these custom fields. Note that only installed Data Products (and thus, replicated data!) support this feature; custom fields are not carried over to Intelligent Applications that are delivered as SAP-managed content.
Making the step: migration paths
To close off, let’s say that you are a potential Business Data Cloud (or Datasphere) user who needs to make the step from SAP BW. You see the value in semantically rich Data Products combined with Delta Sharing and strong SAP integration. What are your options, and how do you get there? In the overview below I have shared a high-level visual of the possible paths from BW to the next generation of (SAP) data warehousing.
In short: the best way to move forward is to either commit to on-premise for the long term through SAP BW/4HANA, or to (gradually) make the move to the cloud by taking the BW PCE to BDC route. My take is that SAP BW 7.5 on HANA PCE is not a smart or sustainable end state, but a temporary situation that buys customers time to ponder their future data landscape. In SAP terms this would be either SAP Datasphere (as part of BDC), SAP BW/4HANA, or SAP BW/4HANA PCE (in BDC). For more details on specific options, you can either check SAP’s migration assessment or read more about BW PCE in detail.
If you want tailored advice and in-depth support to continue your organization’s data and analytics journey, do not hesitate to contact Expertum. Alternatively, feel free to read our other blog posts on SAP Business Data Cloud or download our SAP Datasphere White Paper here.