Hatchfi Engineering Update: Getting Our Infrastructure Ready for SOC2 Type 1 Compliance Certification

As engineers and founders, Casey and I have always appreciated companies of all shapes and sizes that are transparent about their engineering journeys and mishaps. Teams that love to share what they're building, and that are quick to take ownership of whatever happens, good or bad, bring a smile to our faces and build long-lasting trust between customers and businesses.

At Hatchfi, we want to pay it forward by bringing that same transparency to our customers about our engineering efforts, both the good and the bad.

So what's happening at Hatchfi, you may ask? Well... a lot!

Infrastructure Migration to AWS Elastic Container Service

We are prepping our infrastructure for SOC2 Type 1 compliance certification to ensure that we provide financial institution-level security for all our customers. We must support everyone with enterprise-grade security and best practices, whether you are a single developer building an awesome side project or an enterprise financial institution where regulation, compliance, and security reign supreme.

For our new deployment, we will be using a combination of services, including AWS, Cloudflare, and MongoDB. At the edge, Cloudflare handles DNS record hosting and API endpoint security, letting us cut off abusers quickly with minimal impact on API performance. From there, traffic is routed into a secure multi-zone, multi-region AWS deployment running in a hot-cold regional failover configuration.

Proposed v2 Hatchfi Architecture

We decided to deploy our API on ECS (Elastic Container Service) with Fargate on AWS for scalability, performance, reliability, and consistency. The deployment is spread across three Availability Zones within a single region for durability and fault tolerance, with a full cold standby deployment in a second region.
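To make the hot-cold idea concrete, here is a minimal Python sketch of the failover decision. This is purely illustrative (the region names and health-check results are assumptions, not our actual configuration); in practice this logic lives in DNS failover records or a health-checking load balancer, not application code.

```python
# Sketch of a hot-cold regional failover decision.
# Region names are hypothetical examples.

PRIMARY_REGION = "us-east-1"   # hot: serves all traffic normally
STANDBY_REGION = "us-west-2"   # cold: promoted only on regional failure

def pick_region(health: dict[str, bool]) -> str:
    """Route to the primary region while it is healthy,
    otherwise fail over to the cold standby."""
    if health.get(PRIMARY_REGION, False):
        return PRIMARY_REGION
    return STANDBY_REGION

print(pick_region({"us-east-1": True, "us-west-2": True}))   # us-east-1
print(pick_region({"us-east-1": False, "us-west-2": True}))  # us-west-2
```

The point of keeping the standby "cold" is cost: the second region is only spun up (via our automated deployment tooling) when the primary is declared unhealthy.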

For our database, we run a dedicated MongoDB cluster with a single primary and two secondary nodes spread across Availability Zones for additional resiliency, durability, and performance. The cluster is also backed up several times a day to an S3 bucket so that we can quickly redeploy in the event of a regional failure.
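A regional restore starts by picking the newest backup. Here is a small Python sketch of that selection step; the timestamped S3 key naming scheme below is an assumption for illustration, not our actual bucket layout.

```python
# Sketch: choosing the most recent database backup for a restore.
# Assumed (hypothetical) key format: backups/YYYY-mm-ddTHH-MM-SS.archive
from datetime import datetime

def latest_backup(keys: list[str]) -> str:
    """Return the newest backup key by its embedded UTC timestamp."""
    def ts(key: str) -> datetime:
        stamp = key.split("/")[-1].removesuffix(".archive")
        return datetime.strptime(stamp, "%Y-%m-%dT%H-%M-%S")
    return max(keys, key=ts)

keys = [
    "backups/2022-10-01T06-00-00.archive",
    "backups/2022-10-01T18-00-00.archive",
    "backups/2022-10-01T12-00-00.archive",
]
print(latest_backup(keys))  # backups/2022-10-01T18-00-00.archive
```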

To keep things running smoothly, we will be automating our entire infrastructure deployment with Terraform, making it dead simple and fast to stand up dev, stage, and prod environments. It's a heavy lift early on, but it will pay off in the long run, especially in disaster recovery scenarios.

Latest API Updates

With the Hatchfi API architecture, we've been doing a lot of quality-of-life and performance work over the past few months. Our most recent efforts have focused on architecting a message queuing system for better account-syncing performance across wallets and accounts of different types and sizes. We found that a single queue caused some account syncs to take ungodly long because of the sheer size of their transaction histories, and failures to pull data, though rare, could occur.

Some of the benefits we gain from this queuing architecture include, but are not limited to:

  • No more waiting on Hatchfi Link; once a message has been added to the queue, we can immediately return a "success, we're syncing your account" response
  • It lets us implement webhooks for connection failures/success
  • Ensures that failed connections due to timeouts or similar errors can be retried
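The behavior above can be sketched in a few lines of Python. This is an illustration under stated assumptions, not our production code: an in-process queue stands in for a real message broker, and the names (`enqueue_sync`, `MAX_RETRIES`, `sync_account`) are hypothetical.

```python
# Sketch of the sync queue: enqueue returns immediately,
# and syncs that fail with transient errors are retried.
import queue

MAX_RETRIES = 3
sync_queue: "queue.Queue[tuple[str, int]]" = queue.Queue()

def enqueue_sync(account_id: str) -> str:
    sync_queue.put((account_id, 0))  # (account, attempt count)
    return "success, we're syncing your account"  # Link responds right away

def run_worker(sync_account) -> list[str]:
    """Drain the queue, retrying transient failures up to MAX_RETRIES."""
    failed = []
    while not sync_queue.empty():
        account_id, attempts = sync_queue.get()
        try:
            sync_account(account_id)  # pull balances/transactions
        except TimeoutError:
            if attempts + 1 < MAX_RETRIES:
                sync_queue.put((account_id, attempts + 1))  # retry later
            else:
                failed.append(account_id)  # would fire a failure webhook
    return failed
```

For example, a sync that times out once succeeds on its second attempt, so the caller never sees the transient failure, while an account that keeps failing eventually lands on the failed list, where a webhook would notify the developer.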

Regarding integrations, we now support over 27 exchanges, wallets, and protocols with a bunch more in the pipeline, which you can check out on our roadmap. We're focusing on adding additional exchanges, wallets, and protocols over the next several months, so keep an eye out. Also, please feel free to request new providers on our roadmap!

About Hatchfi

Hatchfi is a universal API for connecting to and retrieving your users' crypto financial data. Quickly & securely connect with our APIs to authenticate and aggregate data such as balances, transactions, and holdings from exchanges, wallets, and protocols.

As developers who experienced this problem firsthand when building a fintech tool for crypto, we hated the idea of spending endless hours and engineering cycles building and maintaining our crypto integrations with several different APIs and endpoints mashed together. All we wanted was a Plaid-like UI to easily connect our users to their crypto accounts along with high-quality data, but this seemed too good to be true, which it was.  

After not finding any suitable alternatives, we decided to pivot and make our own "PLAID FOR CRYPTO"!

Click the button below to get your API keys. Simply log into our dashboard to generate your project keys, and start developing with our docs!