Migrating healthcare app databases from AWS Datica to independent AWS servers

03 Oct 2022 Viacheslav Kim, Marat Razzakov

A modern medical company was running a large healthcare platform connecting patients with medical imaging providers. The platform included databases for the admin app, patient app, physician app, and the original service portal, all running on AWS servers deployed and managed by Datica. Datica helped the customer scale and manage their data infrastructure, but it also had control of and access to their data. The customer wanted to go beyond Datica's service offering and gain full control over their data infrastructure.

They decided to migrate to their own independent AWS servers while recreating all functionality and infrastructure in the new cloud location. The customer also wanted to complete the migration in under 3 months, before the next Datica subscription period began. Finally, they wanted everything done in line with HIPAA compliance best practices.

The ABCloudz team handles the task

ABCloudz had helped the customer develop their medical management platform, so they already knew our approach well. The customer was also attracted by our reputation as database management experts capable of handling the most complex data infrastructure migrations and data management implementations. Finally, we had proven HIPAA compliance expertise and were ready to apply our custom practices to complete the migration in under 3 months, which was critical for the customer.

We used database dumps to recreate most of the customer's data infrastructure in the target location. We also used VPC peering to retrieve some objects from the source databases; these objects were staged in an S3 bucket and later migrated manually to the target databases.

As a result of our effort, we transferred the databases for the admin, patient, physician, and service portal apps to the independent AWS environment. The ABCloudz team used AWS Organizations to isolate the database environments, producing three separate environments for the customer's apps: development, staging, and production. You can see these environments in the diagram below.

The ABCloudz team paid close attention to security considerations, which are especially relevant in the healthcare domain, where every failure to secure protected health information can lead to substantial fines under HIPAA rules. That's why we implemented Laika for security and compliance monitoring.

Meanwhile, Amazon CloudWatch was integrated for performance monitoring and reporting, and AWS Elastic Load Balancing (ELB) was used to distribute incoming traffic across the admin app, the service portal, the physician app, and the patient app.

We also used AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy to build the CI/CD pipeline. Finally, AWS CloudFormation helped us create the infrastructure in the AWS cloud.

A challenge with environment variables

There were numerous challenges in recreating Datica's database management features on the independent servers. The most significant concerned the delivery of environment variables. Datica provided its own way of delivering environment variables to containers, but it was no longer available after the transition to the new AWS environment.

At first, we applied the built-in AWS features for such transfers.

In this approach, a single secret in JSON format held all the data required for transferring these environment variables.
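As an illustration, such a single shared secret might look like the following (all key names and values here are hypothetical, not the customer's actual configuration):

```json
{
  "ADMIN_API_URL": "https://api.example.com/admin",
  "PATIENT_API_URL": "https://api.example.com/patient",
  "PHYSICIAN_API_URL": "https://api.example.com/physician",
  "ANALYTICS_KEY": "UA-0000000-1",
  "DB_HOST": "prod-db.internal"
}
```

Because one secret served every app and environment, any change to it rippled through the whole pipeline described below.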

CloudFormation read the variables from this secret and passed them to CodePipeline. The data then traveled through a transitional .yml file stored in an S3 bucket; this file contained Docker commands (docker build, etc.) that changed the environment variables in the containers.

This approach was time- and labor-intensive. The network of connections between the single secret and CloudFormation was too complex to manage. Besides, additional CodePipeline and S3 bucket configuration was required each time environment variables were transferred to the containers.

The customer's administrators couldn't handle this task independently, so they had to ask our specialists every time they wanted to transfer an environment variable.

Check out this flow in the image below:

Custom solution for the challenge

We created a custom approach that makes the delivery of environment variables to containers much faster and smoother. Now, each app has its own corresponding AWS secret.

We built a utility, packed into the Docker image, called secrets_extractor. It retrieves data from AWS Secrets Manager and writes it to a .env file. After that, a second utility, envsubst, reads the .env files, retrieves variables such as API paths and analytics keys, and replaces placeholders within the Docker image with their values.
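The core of this first step can be sketched as follows. This is a minimal illustration, not the actual secrets_extractor code: it assumes the secret payload is a flat JSON object of string pairs, and the sample payload and secret name are hypothetical. In the real tool, the payload would come from AWS Secrets Manager (e.g., via the SDK's get_secret_value call) rather than an inline string.

```python
import json

def secret_to_env(secret_string: str) -> str:
    """Convert a flat JSON secret payload into .env-file lines (KEY=value)."""
    pairs = json.loads(secret_string)
    # Sort keys so the generated .env file is deterministic between runs.
    return "\n".join(f"{key}={value}" for key, value in sorted(pairs.items()))

if __name__ == "__main__":
    # Hypothetical payload; in practice it would be fetched, e.g.:
    #   boto3.client("secretsmanager").get_secret_value(
    #       SecretId="patient-app")["SecretString"]
    payload = '{"API_PATH": "https://api.example.com/v1", "ANALYTICS_KEY": "UA-0000-1"}'
    print(secret_to_env(payload))  # two KEY=value lines, ready to write to .env
```

Keeping one secret per app means each .env file stays small and an administrator can see at a glance which variables belong to which container.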

We used placeholders written with double curly braces ("{{ }}") to save time. The image below illustrates the correspondence between variables and placeholders written in this format.

Normally, you cannot change variables in static files (e.g., .html, .js) once they are compiled, so you have to recompile the app to change variables such as analytics keys or API paths.

Our placeholders, by contrast, are not static: in our algorithm, the envsubst utility substitutes the placeholders with the corresponding variables even after the app is compiled. We also applied the Setsubst utility for large variable sets; it lets us quickly substitute a big set of placeholders with a batch of environment variables.
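The substitution idea itself is straightforward. The sketch below shows it in Python, assuming "{{NAME}}"-style placeholders in an already-compiled file; the function name and the sample JavaScript snippet are illustrative, not the project's actual code.

```python
import re

# Matches {{NAME}} where NAME is a word-like identifier.
PLACEHOLDER = re.compile(r"\{\{(\w+)\}\}")

def substitute(text: str, env: dict) -> str:
    """Replace every {{NAME}} placeholder with its value from env.
    Unknown placeholders are left untouched, so a partial
    variable set never corrupts the file."""
    return PLACEHOLDER.sub(lambda m: env.get(m.group(1), m.group(0)), text)

if __name__ == "__main__":
    # A compiled static file can be rewritten without rebuilding the app:
    compiled_js = 'fetch("{{API_PATH}}/users"); track("{{ANALYTICS_KEY}}");'
    env = {"API_PATH": "https://api.example.com/v1", "ANALYTICS_KEY": "UA-0000-1"}
    print(substitute(compiled_js, env))
```

Because the substitution runs over the built artifact at deploy time, the same artifact can be pointed at different API paths or analytics keys per environment.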

This approach unifies CI/CD for the frontends, allowing us to reuse a single build artifact across multiple environments. Moreover, it enables us to adjust frontend configurations dynamically, on the fly.

With this approach, no technical specialists are required for these transfers. The customer's administrator can store an environment variable in the corresponding secret and be sure it will reach the container.

With the implementation of per-app secrets, the system became much more organized and manageable. You can see its core principles in the image below.

Results of a successful migration

The customer got their data infrastructure migrated from Datica to an independent AWS environment. They now have:

  • A fully manageable data infrastructure
  • Environment isolation for dev, staging, and prod with AWS Organizations
  • Quality traffic distribution between the apps with AWS ELB
  • Quality monitoring with Laika (security and compliance) and CloudWatch (metrics)
  • A fast and convenient practice for delivering environment variables to containers (more than 50% faster than with built-in AWS tools)
  • The ability to change variables without recompiling the app

Additionally, our custom practices enabled us to beat the customer's deadline: the migration was completed in 1 month instead of the expected 3.

ABCloudz for help with complex database migrations

This was only one example of our rich expertise in migrating databases and complex data infrastructures. The ABCloudz team can help you migrate from one AWS environment to another while pushing AWS capabilities to the limit with our custom practices. Contact us to gain full advantage of and control over your AWS infrastructure.

Ready to start the conversation?