Migrate from Oracle to PostgreSQL by automating the process, for a fixed price!

Many companies are leaving Oracle for PostgreSQL

 

Oracle Database has a reputation for being expensive in terms of licensing and support, a pricing policy that makes Oracle less attractive to some companies. Bloomberg notes a drop in Oracle's new license sales, a decline that has continued for seven quarters.

 

On the other side stands PostgreSQL: free, open to modification and redistribution, and with a reputation for reliability and stability, including solid transaction management. All these qualities make it suitable for critical applications where data consistency is essential. PostgreSQL also supports a wide variety of data types, including geospatial and JSON, making it well suited to handling complex and varied data.
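As an illustration, here is a minimal sketch of PostgreSQL's native JSON handling (the table and column names are hypothetical):

```sql
-- Hypothetical event table: PostgreSQL stores and indexes JSON natively via jsonb
CREATE TABLE events (
    id         bigserial PRIMARY KEY,
    payload    jsonb NOT NULL,
    created_at timestamptz DEFAULT now()
);

-- A GIN index speeds up containment queries on the JSON payload
CREATE INDEX idx_events_payload ON events USING gin (payload);

-- Find events whose payload contains a given key/value pair
SELECT id, payload->>'user' AS user_name
FROM events
WHERE payload @> '{"type": "login"}';
```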

 

PostgreSQL is now the fourth most used database in the world, with significant growth rates.

 

Yet this remains a move that is difficult to initiate.

 

According to Carl Olofson, research vice president at IDC, "There are a number of Oracle users who would like to try PostgreSQL for at least part of their workload, but are discouraged by the risk and cost of conversion."

It must be said that PL/SQL, Oracle's procedural language, has been in use since 1991. It is the most widespread database procedural language in the world!

It makes it possible to write complex data-manipulation logic without resorting to a third-party language. As a procedural extension of SQL, it supports fairly complex programs: functions, procedures, triggers, APEX applications, etc.
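For example, here is a minimal PL/SQL sketch (the procedure and table names are hypothetical) combining a cursor loop, a condition and DML in a single stored unit:

```sql
-- Hypothetical PL/SQL procedure: cursor loop, condition and DML in one unit
CREATE OR REPLACE PROCEDURE apply_discount (p_rate IN NUMBER) IS
BEGIN
  FOR c IN (SELECT order_id, amount FROM orders WHERE status = 'OPEN') LOOP
    IF c.amount > 1000 THEN
      UPDATE orders
      SET    amount = c.amount * (1 - p_rate)
      WHERE  order_id = c.order_id;
    END IF;
  END LOOP;
  COMMIT;
END apply_discount;
/
```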

PL/SQL cannot be migrated as is to PostgreSQL, at least not simply. These migration projects are often long and costly, and sometimes fail outright.

 

But not necessarily 😊.

 

We have developed a proven two-step methodology that makes this migration a mechanical success, on a fixed-price basis, when the Oracle database is essentially used as a data warehouse with PL/SQL chains.

 

Step #1: Simplify the source system

Log analysis by {openAudit}, our software, detects the information that flows through the dataviz tools and through all queries (JDBC, ODBC, etc.).

{openAudit}'s data lineage, by tracing all the flows that generate real use, makes it possible to separate the useful parts of the information system from the useless ones. Massive decommissioning then becomes possible ahead of the migration.

Fine-grained introspection of the flows inside the databases also allows {openAudit} to factorize the code.

Step #2: Technically migrate from Oracle to PostgreSQL

{openAudit} “parses” the PL/SQL: it breaks down all the complexity of the code using a grammar that allows exhaustive, ultra-granular analysis. All the subtleties of PL/SQL are taken into consideration.

{openAudit} deduces the overall flow and intelligence of the code, which is reconstructed in an agnostic algorithmic tree (it could just as well be expressed as simple Scratch).

On this basis, {openAudit} produces standard SQL.

Then the intelligence is reconstructed, at a minimum in PL/pgSQL, by {openAudit}.
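As a rough illustration of the kind of translation involved (a sketch based on the hypothetical PL/SQL procedure shown earlier, not {openAudit}'s actual output), the same logic could be reconstructed in PL/pgSQL as follows:

```sql
-- Sketch of a PL/pgSQL equivalent of the earlier PL/SQL example
-- (hypothetical names; transaction control moves to the caller)
CREATE OR REPLACE PROCEDURE apply_discount(p_rate numeric)
LANGUAGE plpgsql
AS $$
DECLARE
  c RECORD;
BEGIN
  FOR c IN SELECT order_id, amount FROM orders WHERE status = 'OPEN' LOOP
    IF c.amount > 1000 THEN
      UPDATE orders
      SET    amount = c.amount * (1 - p_rate)
      WHERE  order_id = c.order_id;
    END IF;
  END LOOP;
END;
$$;
```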

Certain complex processing that cannot be reproduced in plain SQL is driven by a NodeJS executable: typically “for loop” cursors, variables, “if/else” conditional code, “switch” statements, procedure calls, etc.

 

If needed, new orchestration mechanisms can be implemented to deconstruct cursors of cursors (loops of loops), or to optimize the transformation chains*.
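To give an idea of what such an optimization can look like (again a hypothetical sketch, not {openAudit}'s actual output), a row-by-row cursor loop can often be collapsed into a single set-based statement:

```sql
-- Set-based rewrite of the earlier cursor loop: one statement replaces
-- row-by-row processing (hypothetical schema, p_rate = 0.10 for example)
UPDATE orders
SET    amount = amount * (1 - 0.10)
WHERE  status = 'OPEN'
  AND  amount > 1000;
```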

In general, it is thus possible to decommission the Oracle assets and reproduce all of their intelligence in PostgreSQL in an extremely efficient manner.

 

We commit to the result: migrations are carried out on a fixed-price basis.

*There are some limitations: unit procedures called by external code or by triggers may require ad hoc handling.

