Move from Teradata to GCP?

The challenge taken up by ADEO/Leroy Merlin
 
 
Moving off Teradata - a technical challenge 
 
 
Teradata is an “appliance” chosen by countless players. Teradata's specialization in data warehousing and analytics has enabled solutions with exceptional computing capacity and strong scalability. But most players are now moving to the cloud, and few have opted for Teradata Vantage, Teradata's cloud offering.

The ADEO/Leroy Merlin group (120,000 employees worldwide) has decided to move its Teradata assets to Google Cloud Platform (BigQuery), and to migrate a number of data visualization technologies (SAP BO, PowerBI) to other technologies supported on the target platform (Looker, Data Studio). 

But the technical difficulties of such a project are numerous:
  • How to maintain, continuously, an inventory of what exists, with all its dependencies;
  • How to know what has been migrated and what remains to be migrated, and how to compare the responses of the two platforms (see the sketch after this list);
  • How to avoid “load breaks” in the feeding chains;
  • How to enable data engineering teams to collectively control the migration process; etc.
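
By way of illustration, here is a minimal sketch of the kind of source/target reconciliation such a migration relies on: comparing row counts and a simple column aggregate between a Teradata table and its BigQuery counterpart. The connection details, table and column names, and the use of the teradatasql and google-cloud-bigquery client libraries are assumptions made for the example, not a description of openAudit®'s internals.

```python
# Minimal source/target reconciliation sketch. Hosts, credentials,
# table names and the "amount" column are hypothetical placeholders.
import teradatasql
from google.cloud import bigquery

def teradata_profile(table: str) -> tuple:
    # Row count plus a portable aggregate on a numeric column.
    with teradatasql.connect(host="td-host", user="user", password="pwd") as con:
        with con.cursor() as cur:
            cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {table}")
            return tuple(cur.fetchone())

def bigquery_profile(table: str) -> tuple:
    client = bigquery.Client()
    rows = client.query(f"SELECT COUNT(*), SUM(amount) FROM `{table}`").result()
    return tuple(next(iter(rows)))

def compare(td_table: str, bq_table: str) -> bool:
    # Equal profiles do not prove equality, but unequal profiles prove a gap.
    return teradata_profile(td_table) == bigquery_profile(bq_table)

if __name__ == "__main__":
    ok = compare("sales.orders", "my-project.sales.orders")
    print("match" if ok else "divergence to investigate")
```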
 
Step #1: 
Master Teradata for controlled decommissioning. 
 
 
 
openAudit® analyzes the Teradata source platform on a daily basis:

By relying on parsers and probes that run continuously, openAudit® provides an ultra-granular daily analysis of Teradata assets, as well as of all the data visualization solutions connected to it.
 
In detail, openAudit® will highlight: 
  • Internal processes, via physical data lineage down to field level, in BTEQ, but also Stambia, views, macros and other scripts associated with the feeding flows (a lineage-extraction sketch follows this list); 
  • The uses of information, via an analysis of the audit database logs; 
  • Task scheduling; 
  • The impacts in the data visualization tools associated with Teradata (in this case: PowerBI, SAP BO...), to grasp the related complexity (calculation rules) and to make the data lineage truly end to end.
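
As a rough illustration of the lineage idea in the first bullet, here is a minimal sketch that pulls table-to-table dependencies out of a BTEQ/SQL script with regular expressions. A real parser handles far more (macros, views, DDL, conditional logic); this only shows the principle, and the script content is invented.

```python
# Sketch: extract (source, target) lineage edges from a BTEQ/SQL script.
import re

INSERT_RE = re.compile(r"INSERT\s+INTO\s+([\w.]+)", re.IGNORECASE)
SOURCE_RE = re.compile(r"(?:FROM|JOIN)\s+([\w.]+)", re.IGNORECASE)

def extract_lineage(script: str) -> list:
    """Return (source_table, target_table) edges found in one script."""
    edges = []
    for statement in script.split(";"):
        targets = INSERT_RE.findall(statement)
        sources = SOURCE_RE.findall(statement)
        for tgt in targets:
            for src in sources:
                if src.lower() != tgt.lower():
                    edges.append((src, tgt))
    return edges

bteq = """
INSERT INTO dwh.sales_agg
SELECT store_id, SUM(amount)
FROM staging.sales s
JOIN ref.stores r ON r.store_id = s.store_id
GROUP BY store_id;
"""
print(extract_lineage(bteq))
# [('staging.sales', 'dwh.sales_agg'), ('ref.stores', 'dwh.sales_agg')]
```

On the usage side (second bullet), Teradata's query logging facility (DBQL, exposed through views such as dbc.QryLogV) is the kind of audit log such an analysis can draw on, though the document does not detail how openAudit® reads it.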
     
Step #2: 
Master the deployment in GCP at the same time, for a harmonious ramp-up. 
     
     
     
openAudit® will, at the same time, continuously analyze the target platform, GCP:
     
In the same way as for the source platform, openAudit® carries out various actions on the target platform to measure its evolution during the migration, and beyond:  
  • Dynamic analysis of BigQuery: scheduled queries, view scripts, and loading files such as JSON, CSV, etc., to reconstruct the logic of the feeding flows (see the sketch after this list); 
  • Analysis of the logs in Google Cloud Operations (formerly Stackdriver), to immediately know the uses of the information; 
  • Introspection of certain “target” data visualization technologies that rely on GCP (Looker, Data Studio, BO Cloud, PowerBI, etc.), so as to reconstruct the “intelligence” and compare the responses. 
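
To make the first two bullets concrete, here is a minimal sketch of how view definitions and recent query activity can be read from BigQuery's INFORMATION_SCHEMA with the google-cloud-bigquery client. The project, dataset and region names are placeholders, and this illustrates the kind of introspection involved rather than openAudit®'s actual implementation.

```python
# Sketch: introspect BigQuery views and recent query activity.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# View definitions of one dataset: raw material for lineage parsing.
views = client.query(
    "SELECT table_name, view_definition "
    "FROM `my-project.my_dataset.INFORMATION_SCHEMA.VIEWS`"
).result()
for v in views:
    print(v.table_name, "->", v.view_definition[:80], "...")

# Recent jobs: which tables are actually queried, and by whom.
jobs = client.query(
    "SELECT user_email, query, total_bytes_processed "
    "FROM `my-project`.`region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT "
    "WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)"
).result()
for j in jobs:
    print(j.user_email, j.total_bytes_processed)
```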
     
Furthermore, existing connectors can be pointed at BigQuery, for example through Datometry's Hyper-Q middleware (at the cost of some performance deterioration). 
     
     
For an optimal understanding of the changes in the information system, we provide a multi-platform map (source and target) that presents flows and uses with a variable level of granularity.
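
To make “variable granularity” concrete, here is a minimal sketch, assuming the kind of lineage edges extracted above and the networkx library: the same cross-platform graph can be viewed at table level or collapsed to schema/dataset level. The node names are placeholders, and this is an illustration of the concept, not openAudit®'s actual rendering.

```python
# Sketch: one lineage graph spanning both platforms, at two granularities.
import networkx as nx

g = nx.DiGraph()
# Table-level edges; Teradata (td), BigQuery (bq) and dataviz nodes mixed.
g.add_edge("td:staging.sales", "td:dwh.sales_agg")
g.add_edge("td:dwh.sales_agg", "bq:reporting.sales_agg")   # migrated flow
g.add_edge("bq:reporting.sales_agg", "looker:sales_dashboard")

def coarsen(node: str) -> str:
    # Collapse "platform:schema.table" down to "platform:schema".
    platform, _, rest = node.partition(":")
    return f"{platform}:{rest.split('.')[0]}"

coarse = nx.DiGraph((coarsen(u), coarsen(v)) for u, v in g.edges)
print(sorted(coarse.edges))
# [('bq:reporting', 'looker:sales_dashboard'),
#  ('td:dwh', 'bq:reporting'), ('td:staging', 'td:dwh')]
```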
       

       
Conclusion:

We do not believe that a migration of this ambition can be organized around “kick-offs” and “deadlines” alone, but rather through an intelligent process based on a real mastery of both the source platform and the target platform, via continuous technical introspection of processes and uses, and a graphical representation of the information systems that everyone can understand and exploit.
