An Experience with Database Modernization: From Oracle to Aurora PostgreSQL – A Cloud Perspective

Carlos Moreno - Aug 2 - Dev Community

Hey cloud folks! Recently, as a Solutions Architect, I had the chance to work on a pretty cool project with a customer. They were running several Oracle databases on Amazon RDS for their core services, and they wanted to switch things up to Amazon Aurora PostgreSQL. Why? Cost savings, better performance, and the flexibility of open source – the usual suspects!

Now, migrating code and schemas from Oracle PL/SQL to PostgreSQL isn't exactly a walk in the park. It's like translating between two languages with different dialects, data types, and grammar rules. We had to tackle stored procedures, packages (with their bundled functions and procedures), standalone functions, and even some quirky Oracle-specific features. Automatic conversion tools like AWS SCT are lifesavers, but there's always that "last mile" that needs some hands-on attention.
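
To give you a taste of what that translation looks like: PostgreSQL has no direct equivalent of Oracle packages, so one common hand-conversion pattern is to map a package to a schema and recreate its members as standalone functions. Here's a minimal sketch of that idea – the `billing_pkg` package, the `payments` table, and the column names are all hypothetical:

```sql
-- Oracle (before): a package groups related routines under one name
CREATE OR REPLACE PACKAGE billing_pkg AS
  FUNCTION get_balance(p_customer_id IN NUMBER) RETURN NUMBER;
  PROCEDURE apply_payment(p_customer_id IN NUMBER, p_amount IN NUMBER);
END billing_pkg;
/

-- PostgreSQL (after): no packages, so emulate one with a schema
CREATE SCHEMA billing_pkg;

CREATE FUNCTION billing_pkg.get_balance(p_customer_id numeric)
RETURNS numeric
LANGUAGE sql
AS $$
  SELECT COALESCE(SUM(amount), 0)
  FROM   payments
  WHERE  customer_id = p_customer_id;
$$;
```

The nice part of this pattern is that callers barely change: `billing_pkg.get_balance(42)` reads the same in both worlds, which keeps application code churn low.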

Schema Conversion Tool

The Approach: A Mix of Automation and Expertise

We rolled up our sleeves and put together a crack team of database specialists and AWS experts who knew their way around both Oracle and PostgreSQL. We used a phased approach, starting with a deep dive into the database landscape. We mapped out dependencies, prioritized schemas, and even did some code spelunking to see what we were up against.

For the actual migration, we leaned on AWS DMS (Database Migration Service) and SCT (Schema Conversion Tool) to automate as much as possible – almost 50% of the conversion. But we also had to get our hands dirty with some custom scripts, and we even used some fancy GenAI agents to help us out with code conversion suggestions. Think of them as our AI-powered code whisperers!

SCT Analysis Results

Phase 1: The Scouting Mission – Detailed Evaluation and Planning

Before we dove headfirst into migrating the client's databases, we took a step back and planned our attack. We mapped out a detailed migration plan, complete with a realistic timeline (remember, this was a time & materials gig, so flexibility was key!), assigned responsibilities, and a backup plan in case things got bumpy. We wanted to make sure everyone was on the same page and knew their role.

We also did a deep dive into the client's database ecosystem. We figured out how the databases were connected to each other and to the applications they supported. We sorted the databases by how important they were and how tricky they might be to migrate. We even peeked under the hood of some custom code to get a sense of the manual conversion effort we were in for. Think of it as a pre-flight checklist before takeoff!
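
For that sizing exercise, Oracle's data dictionary is your friend. Something like the queries below gives a rough per-schema inventory of code objects and source lines – the `BILLING` and `INVENTORY` schema names are placeholders, not the client's real ones, and you'd need DBA view access (otherwise swap in the `ALL_` views):

```sql
-- Oracle: rough sizing of the conversion effort per schema
SELECT owner,
       object_type,
       COUNT(*) AS object_count
FROM   dba_objects
WHERE  owner IN ('BILLING', 'INVENTORY')   -- hypothetical schemas
AND    object_type IN ('PACKAGE', 'PACKAGE BODY', 'PROCEDURE',
                       'FUNCTION', 'TRIGGER', 'VIEW')
GROUP  BY owner, object_type;

-- Total lines of stored PL/SQL source per schema
SELECT owner,
       COUNT(*) AS source_lines
FROM   dba_source
WHERE  owner IN ('BILLING', 'INVENTORY')
GROUP  BY owner;
```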

Phase 2: Gearing Up – Preparation for Database Modernization

With our plan in hand, we started setting up our toolkit. We configured AWS DMS and SCT, but we also wrote some custom scripts to handle the more complex stuff.

We even had some AI buddies helping us out with code conversion suggestions – hey, we're not afraid to embrace new tech!

Amazon Q Developer in action

To make sure our plan was solid, we ran a pilot migration on one of the less complex schemas. This was our chance to test our tools and our manual conversion strategy, and to get feedback from the client's team. It's like a test drive before buying a new car, right?

A Very Good Manual Conversion Strategy

Alright, let's talk about the nitty-gritty of manual code conversion. Since the automated tools couldn't handle everything, we had to roll up our sleeves and dive into the Oracle code ourselves. Here's how we tackled it:

- Analyzing and Planning

First, we put on our detective hats and really got to know the Oracle code. We combed through every line, looking for clues about the business logic, dependencies, and any Oracle-specific quirks. We talked to the developers and users to get the inside scoop, and we even drew up flowcharts to visualize how everything worked together. It was like piecing together a puzzle!
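
The data dictionary helped here too. A dependency query along these lines shows which objects call which, so you can plan the conversion order and keep callers and callees together (the schema name is again a placeholder):

```sql
-- Oracle: map object dependencies before converting anything
SELECT owner,
       name,
       type,
       referenced_owner,
       referenced_name,
       referenced_type
FROM   all_dependencies
WHERE  owner = 'BILLING'                          -- hypothetical schema
AND    referenced_owner NOT IN ('SYS', 'SYSTEM')  -- skip built-ins
ORDER  BY name, referenced_name;
```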

- Converting Functions, Procedures, and More

Next, we started translating the Oracle code into PostgreSQL. This meant replacing things like implicit cursors with explicit loops or SELECT statements, and adapting exception handling to PostgreSQL's syntax. We also had to carefully check data types and make sure they were compatible. It was like learning a new language, but with code instead of words!
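
Here's a simplified before-and-after of that translation, showing the type mapping (NUMBER → numeric, VARCHAR2 → text) and the exception-handling change – the `customers` table and function name are made up for illustration. One gotcha worth calling out: in PL/pgSQL you need the STRICT keyword, because plain SELECT INTO silently returns NULL instead of raising NO_DATA_FOUND like Oracle does:

```sql
-- Oracle PL/SQL (before)
CREATE OR REPLACE FUNCTION get_customer_name(p_id IN NUMBER)
RETURN VARCHAR2 IS
  v_name VARCHAR2(100);
BEGIN
  SELECT name INTO v_name FROM customers WHERE id = p_id;
  RETURN v_name;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN 'UNKNOWN';
END;
/

-- PostgreSQL PL/pgSQL (after)
CREATE OR REPLACE FUNCTION get_customer_name(p_id numeric)
RETURNS text
LANGUAGE plpgsql
AS $$
DECLARE
  v_name text;
BEGIN
  -- STRICT makes SELECT INTO raise NO_DATA_FOUND on zero rows,
  -- mimicking Oracle's implicit-cursor behavior
  SELECT name INTO STRICT v_name FROM customers WHERE id = p_id;
  RETURN v_name;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN 'UNKNOWN';
END;
$$;
```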

SCT Analysis Results for Database Objects

- Documentation and Optimization

Once we had the code converted, we meticulously documented all the changes we made. We also looked for ways to optimize the code and make it run even smoother on PostgreSQL. Think of it as fine-tuning a race car for peak performance!
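
One optimization pattern came up again and again: converted code that loops row by row (a faithful port of the original PL/SQL) can usually be collapsed into a single set-based statement. A toy example, with a made-up `orders` table:

```sql
-- Before: a loop ported literally from Oracle, one UPDATE per row
DO $$
DECLARE
  r RECORD;
BEGIN
  FOR r IN SELECT id FROM orders WHERE status = 'PENDING' LOOP
    UPDATE orders SET status = 'PROCESSED' WHERE id = r.id;
  END LOOP;
END;
$$;

-- After: one set-based statement does the same work in a single pass
UPDATE orders
SET    status = 'PROCESSED'
WHERE  status = 'PENDING';
```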

This manual conversion process was definitely a challenge, but it was also super rewarding. It's a bit like restoring a classic car – it takes time and effort, but the end result is a thing of beauty that runs like a dream.

Phase 3: Full Speed Ahead – Database Modernization at Scale

Once we had the green light from the pilot, we cranked up the migration to full speed. We migrated the databases in batches, taking advantage of AWS's scalability to get things done faster. We also kept a close eye on performance and made sure everything was running smoothly.
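
For the smoke tests after each batch, even a simple row-count comparison against numbers captured on the Oracle side catches a lot. On the PostgreSQL end, something like this gives a quick per-table picture – note these are statistics-collector estimates, so for exact validation you'd still run COUNT(*) on both sides:

```sql
-- PostgreSQL: approximate row counts for all migrated user tables
SELECT schemaname,
       relname,
       n_live_tup AS approx_rows
FROM   pg_stat_user_tables
ORDER  BY schemaname, relname;
```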

After the migration, we did a final round of testing and optimization. We wanted to squeeze every ounce of performance out of Aurora PostgreSQL and make sure the client's team was happy with the results. We even gave them some training on how to manage their shiny new database environment.

Lessons Learned

One thing we learned is that schema complexity matters. All of Lala's databases were labeled "Very Complex" by the tools, so we knew we'd have our work cut out for us. We also found that about 55% of code objects needed manual conversion – that's a lot of coffee-fueled coding sessions!

The Payoff: Happy Cows, Happy Devs

In the end, the hard work paid off. The client is getting those sweet cost savings they were after, and their databases are humming along on Aurora PostgreSQL. Plus, they've got a more modern, scalable platform that's ready for whatever the future holds.

If you're thinking about a similar database modernization project, my advice is to plan carefully, use the right tools, and don't be afraid to get your hands dirty with some manual code wrangling. Oh, and maybe keep a coffee pot handy!
