With the explosion of Artificial Intelligence in the first half of last year, SAP’s focus in most of its software areas has been on integrating this technology into its solutions in a useful manner. The cost and effort of such an endeavor are anything but small, and developments in other areas have thus seen fewer leaps and more iterative steps ever since. Whether this is simply a strategic allocation of capacity or a sign that SAP’s solutions are maturing more quickly is a judgment I leave to the reader. In any case, SAP Datasphere (DSP) has seen fewer major new functionalities released in the past few months, the only noteworthy examples being Compatibility Contracts, Workload Management and Selective Deletion. So where does that leave us after this year’s TechEd? Read on to find out!
- Multilanguage support (especially with AI use cases in mind).
- Currency Conversion made available in more modeling artifacts.
- More extensive hierarchies (in different formats).
- Time dependencies (e.g. to determine the validity of specific data records).
- Variables in Datasphere models.
- Business Content (that can be (re)used for actual use cases).
The second big announcement for Datasphere was the support for Data Lake technology with SQL on files. This is something the solution sorely lacked in our opinion, as competitors such as Snowflake had a significant leg up on SAP when it came to mass cold storage options. Curiously, SAP has somewhat positioned the Data Lake (for Datasphere) as a ‘soft’ replacement for BW(/4HANA) in terms of storage (obviously without the logic from BW(/4HANA)). Although we doubt whether this will be its practical use, the Lake functionality does allow data to be stored in its original state while maintaining the relevant business context. Note that it can also be scaled easily through SAP’s (newly “Integrated”) Object Store. The details on the implementation of this feature, which is to be delivered in Q4 of this year, are still somewhat vague. Having said that, we expect that this path will be similar to how the data lake is currently offered through SAP HANA Cloud and that it will involve the Migration Cockpit. The Data Lake capabilities are a welcome expansion of Datasphere, however, and we will be eager to see whether it will also become possible to connect to non-SAP lake sources, such as Amazon’s S3, as this would open up even more use cases.
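To make the SQL-on-files idea a bit more tangible, here is a minimal sketch of what querying a file-backed data lake table with plain SQL might look like. All schema, table and column names below are invented for illustration, and the actual SQL-on-files interface may well differ once the feature ships; the `hdbcli` driver mentioned in the comments is SAP’s existing Python client for HANA Cloud, used here only as a plausible access path.

```python
# Hypothetical sketch: files landed in the (Integrated) Object Store are
# exposed as a SQL-queryable table, so ordinary SQL can run on raw data
# while the business context (column names, grouping) is preserved.
# Schema/table/column names are assumptions, not a released SAP interface.

def build_lake_query(schema: str, table: str, fiscal_year: int) -> str:
    """Compose a SQL statement against a (hypothetical) file-backed table."""
    return (
        f'SELECT "REGION", SUM("REVENUE") AS "TOTAL_REVENUE" '
        f'FROM "{schema}"."{table}" '
        f'WHERE "FISCAL_YEAR" = {fiscal_year} '
        f'GROUP BY "REGION"'
    )

# Against a live SAP HANA Cloud instance, the statement could be executed
# with the hdbcli driver roughly like this (connection details omitted):
#
#   from hdbcli import dbapi
#   conn = dbapi.connect(address="<host>", port=443,
#                        user="<user>", password="<password>")
#   cursor = conn.cursor()
#   cursor.execute(build_lake_query("SALES_LAKE", "ORDERS_RAW", 2024))
#   for region, total in cursor.fetchall():
#       print(region, total)

print(build_lake_query("SALES_LAKE", "ORDERS_RAW", 2024))
```

The point of the sketch is the shape of the workflow, not the exact API: raw files stay where they are, and the consumer only sees a relational view that ordinary SQL tooling can work with.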
Smaller announcements on Datasphere included more compute elasticity (and power) with Spark Compute, a feature that is currently present in SAP Data Intelligence and that will be a welcome addition to Datasphere. Strangely enough, we would have expected more visible efforts from SAP to close the functionality gap between SAP Data Intelligence (DI, which is now being deprecated) and SAP Datasphere. Although the first Generative AI-enhanced workflows will arrive in Datasphere in Q1 of 2025, DI still offers quite a feature set that would be valuable to customers (e.g. Jupyter Notebooks), so we will have to wait and see what SAP’s plans are in this regard.

All in all, we can conclude that SAP is focusing on three main development areas in the context of SAP Datasphere: bridging the functionality gap between DSP and BW(/4HANA), integrating and exchanging data with (non-)SAP systems, and enabling Generative AI. If you want to know what implications this may have for your IT landscape, or you are curious about how to make the most of SAP’s feature direction, don’t hesitate to contact us!
This blog about the highlights of SAP TechEd 2024 was written by our experts Lars van der Goes and Stefan Rijnkels.