Migrate from SAP BO to Power BI, automatically, on a fixed price basis


SAP had previously announced the end of life of SAP BO.

In the end, the roadmap has changed somewhat: the platform will now move towards SAP BusinessObjects BI 2025 (currently a code name).

 

However, a number of components will not be carried over into BI 2025, including the Universe Design Tool (used to build .unv universes) and multi-source universes. These components will remain in BI 4.3, but maintenance support for that version will end in 2025.
Beyond that, the picture is not entirely clear, even though SAP has announced that it wants to focus on its most widely adopted tools.

 

Furthermore, self-service BI is popular with business teams, but also with IT departments, which are happy to give those teams more autonomy.

 

That being said, the project looming ahead is monumental, and it rightly worries the teams involved: some of our clients have more than 100,000 BO dashboards in production. The migration could be very long and extremely expensive.

 

With this in mind, Ellipsys has created a migration solution for the dataviz layer that relies on the data produced by the {openAudit} solution.

This functionality is already at work for a major client, in combination with our SAP BO layer cleaning/simplification module.

 

For this client, the project involves migrating thousands of SAP BO dashboards to Power BI, with the entire process automated 😊.

 

Below are some highlights of the methodology we apply on this project:

 

Automated analysis of the BO platform 

 

 

To analyze the complexity of the reports, we rely on the tables generated directly by {openAudit} during its daily analysis of the SAP BO platform.

 

We generally collect the following information (a small complexity-scoring sketch follows these lists):

 

In terms of dashboard intelligence

  • The list of expressions and variables used and their level of nesting; 
  • The list of basic BO functions used in these expressions and variables;  
  • The sourcing (lineage) of each published element (whether used directly in blocks, filters, input parameters, merged dimensions, alerters, etc.).

 

In terms of layout 

  • The number of tabs, nested blocks, sections;
  • The number and type of tables (HTable, VTable, XTable) and graphics in the layout;
  • The list of filters in the elements.

 

At the source level 

  • The number of data providers and related dimensions;
  • The number of unpublished data provider objects (queried but never displayed);
  • The contexts used;
  • The list of prompts used in reports;
  • Links between data providers, or between statements, for multi-statement data providers;
  • The definitions of custom SQL (overridden or freehand SQL).
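As a concrete illustration of what this metadata enables, below is a minimal sketch of a complexity-scoring pass over per-dashboard statistics, used to bucket dashboards by migration effort. The field names and weights are hypothetical, not {openAudit}'s actual schema or formula.

```python
# Hypothetical complexity scoring over the metadata collected above.
# Field names and weights are illustrative, not {openAudit}'s schema.
from dataclasses import dataclass

@dataclass
class DashboardStats:
    name: str
    expressions: int     # expressions and variables used
    max_nesting: int     # deepest nesting level among them
    data_providers: int
    custom_sql: int      # overridden or freehand SQL statements
    blocks: int          # tables and charts in the layout

def complexity_score(s: DashboardStats) -> int:
    """Weighted score used to bucket dashboards by migration effort."""
    return (s.expressions
            + 5 * s.max_nesting
            + 3 * s.data_providers
            + 10 * s.custom_sql
            + 2 * s.blocks)

stats = [
    DashboardStats("Sales overview", 40, 3, 2, 0, 12),
    DashboardStats("Risk report", 210, 7, 6, 3, 45),
]
for s in sorted(stats, key=complexity_score, reverse=True):
    bucket = "complex" if complexity_score(s) > 200 else "simple"
    print(f"{s.name}: score={complexity_score(s)} -> {bucket}")
```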

 

{openAudit} first identifies the dashboards to be migrated and, from them, the list of objects that will necessarily have to be migrated.

 

The idea is to build an optimized semantic layer in the target technology. 

 

From this list, we determine (a join-graph sketch follows this list):

  • The list of fields and therefore useful tables;
  • The list of joins to connect these tables;
  • The list of BO contexts to reproduce.
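As an illustration of this step, here is a minimal sketch that derives the tables and joins to keep by walking the universe's join graph from the fields actually used; it also pulls in bridge tables that sit on the path between two used tables. All names and structures are hypothetical, not {openAudit}'s internal model.

```python
# Hypothetical derivation of useful tables and joins from used fields.
from collections import deque

# field (universe object) -> source table, from the object definitions
field_to_table = {
    "Customer.Name": "CUSTOMER",
    "Order.Amount": "ORDERS",
    "Product.Label": "PRODUCT",
}

# universe joins: (left table, right table, join condition)
universe_joins = [
    ("CUSTOMER", "ORDERS", "CUSTOMER.ID = ORDERS.CUST_ID"),
    ("ORDERS", "ORDER_LINE", "ORDERS.ID = ORDER_LINE.ORDER_ID"),
    ("ORDER_LINE", "PRODUCT", "ORDER_LINE.PROD_ID = PRODUCT.ID"),
]

used_tables = set(field_to_table.values())

# Build an undirected join graph, then BFS from one used table so that
# walking the parent pointers back from each used table yields the joins
# to keep -- including bridge tables such as ORDER_LINE.
adjacency = {}
for join in universe_joins:
    left, right, _ = join
    adjacency.setdefault(left, []).append((right, join))
    adjacency.setdefault(right, []).append((left, join))

start = next(iter(used_tables))
parent = {start: None}           # table -> join used to reach it
queue = deque([start])
while queue:
    for neighbor, join in adjacency.get(queue.popleft(), []):
        if neighbor not in parent:
            parent[neighbor] = join
            queue.append(neighbor)

kept_tables, kept_joins = {start}, set()
for table in used_tables:
    while table != start:        # walk back toward the start table
        join = parent[table]
        kept_joins.add(join)
        kept_tables.update(join[:2])
        left, right, _ = join
        table = left if right == table else right

print(sorted(kept_tables))       # four tables, including the ORDER_LINE bridge
for join in sorted(kept_joins):
    print(join)
```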

 

This first list of fields, tables, and joins then allows us to gauge the complexity of the semantic layer to reproduce in the target:

  • List of impacted derived tables;
  • Nested or linked objects;
  • Aggregate objects (aggregate_aware);
  • Views and source SQL to modify to meet the target.

 

 

Migrating to Power BI 


Migration of dashboards and intelligence

{openAudit} will convert the objects in each dashboard's data model tables and organize the data prep so as to reproduce the intelligence of the BO documents in DAX and/or M code.

Certain optimizations are also implemented to reduce data prep.
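To give an idea of the rule-based part of such a conversion, here is a minimal sketch that maps a few WebI functions to DAX equivalents and switches the argument separator. The mapping is a tiny illustrative subset, not {openAudit}'s actual rule set, and a real converter would tokenize the formula rather than rewrite it with plain string operations.

```python
# Hypothetical rule-based translation of a WebI formula into DAX.
import re

BO_TO_DAX = {      # WebI function -> DAX function (illustrative subset)
    "If": "IF",
    "Sum": "SUM",
    "Average": "AVERAGE",
    "Length": "LEN",
}

def webi_to_dax(formula: str) -> str:
    """Translate a simple WebI formula into a DAX expression string."""
    expr = formula.lstrip("=")
    for bo, dax in BO_TO_DAX.items():
        # word boundary + '(' avoids renaming substrings of identifiers
        expr = re.sub(rf"\b{bo}\s*\(", f"{dax}(", expr)
    # WebI uses ';' as argument separator in many locales; DAX uses ','.
    # A real converter would tokenize first: ';' may occur inside strings.
    return expr.replace(";", ",")

print(webi_to_dax('=If([Revenue]>100000;"High";"Low")'))
# -> IF([Revenue]>100000,"High","Low")
```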

Layout migration

{openAudit} converts the BO layout directly into a PBIX file, which is then exported to Azure. The organization of the components is preserved. Users also benefit from Power BI's native functionality (dynamic filters, slicers, drill-down, etc.), which enriches the user experience.
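As a rough illustration of the layout mapping, here is a minimal sketch that turns parsed BO blocks into Power BI visual descriptors while carrying their positions over. The block-to-visual dictionary and the JSON shape are simplified stand-ins, not the actual PBIX report schema.

```python
# Hypothetical mapping of BO layout blocks to Power BI visual descriptors.
import json

BLOCK_TO_VISUAL = {      # BO block type -> Power BI visual (illustrative)
    "VTable": "tableEx",        # vertical table
    "HTable": "tableEx",        # horizontal table (transposed on export)
    "XTable": "pivotTable",     # cross table -> matrix
    "Chart": "columnChart",
}

def convert_block(block: dict) -> dict:
    """Turn one parsed BO layout block into a visual descriptor."""
    return {
        "visualType": BLOCK_TO_VISUAL.get(block["type"], "tableEx"),
        # Positions are carried over so the report keeps its organization.
        "layout": {k: block[k] for k in ("x", "y", "width", "height")},
        "fields": block["columns"],
    }

bo_blocks = [
    {"type": "XTable", "x": 0, "y": 0, "width": 640, "height": 320,
     "columns": ["Country", "Year", "Revenue"]},
]
print(json.dumps([convert_block(b) for b in bo_blocks], indent=2))
```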

 
