Finding the right database to power your post-COVID recovery

Though there will probably be a bump or two along the road, businesses are slowly but surely emerging from the long shadow of the pandemic: The World Bank believes the global economy is set for the fastest climb out of recession seen in 80 years, for example.

At the same time, business may have been changed for good by the global health crisis, and that goes well beyond whether we return to the office full time. A fundamental and irreversible shift towards digital transformation began during COVID, and it is only accelerating; McKinsey reports that for many companies, digital projects earmarked for the next seven years instead happened in a matter of months.

What got you here won’t get you there

The CIO has been a critical player here, turning to ever more aggressive use of the cloud; in many ways, digital transformation ultimately means delivering computing power, and to a large extent services, to customers through it. The problem: to quote business guru Marshall Goldsmith, “What got you here won’t get you there.” If we really want to use the cloud to reach the next level in the digital economy, we’re going to have to face up to some home truths about the way we run business systems in cyberspace.

What do I mean by this? As is well known, the first generation of cloud migration was all about turning CapEx into OpEx. And that was great, except we soon found you can’t take a big IBM or Oracle database and just put it on the cloud. Lift and shift is inherently risky anyway, and it simply isn’t technically feasible for large, monolithic applications.

The way the first generation of enterprise cloud engineers worked around this was by breaking these big systems down into lots and lots of little things: hence virtualisation (putting everything you could onto non-local servers in a data centre) and containerisation (the idea that, rather than installing every piece of software onto an operating system on a virtual server on a piece of tin, you could package it into a container, first in virtual machines and soon in Docker and the like). This is probably where most people are now, using technologies like Kubernetes to orchestrate large numbers of containers. Most people are also now more or less committed Agile practitioners, which is a nimble way of making all this complexity work.
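
To make the container idea concrete, here is a purely illustrative sketch (not from the original piece) using the Docker SDK for Python; it assumes Docker is running locally and the docker package is installed, and the image name, container name and port are arbitrary examples:

```python
# Illustrative only: running a packaged service as a container,
# rather than installing it onto a server's operating system.
# Assumes Docker is running locally and the "docker" SDK for Python
# (pip install docker) is available; image, name and port are examples.
import docker

client = docker.from_env()

# Pull and start a stock Redis image as an isolated container,
# mapping its port to the host so other services can reach it.
container = client.containers.run(
    "redis:7-alpine",
    name="example-cache",
    ports={"6379/tcp": 6379},
    detach=True,
)

print(container.name, container.status)
```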

NoSQL was great, but it didn’t get us all the way over the finish line

The problem is that we never really fixed the database side. The promise of going to the cloud, of making your service available to everyone at any time, in any place, on any device, was that the database would always be available (you can’t afford to have it go down, you want to avoid downtime for planned maintenance, and lengthy upgrade cycles aren’t great for the online customer experience). Before you say it, yes, this is the problem distributed NoSQL databases solve: you can deploy them on hundreds of nodes in the cloud, distribute them across different regions, and get resilience.
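
As a hedged illustration of what that deployment model looks like from an application’s point of view, here is a minimal sketch using MongoDB’s Python driver as a stand-in for a distributed NoSQL database; the hostnames, replica-set name and data are invented placeholders:

```python
# Illustrative sketch only: a client connecting to a NoSQL database
# replicated across several regions for resilience.
# Assumes pymongo is installed; the hostnames and replica set name
# ("rs0") are hypothetical placeholders.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://node-eu.example.com:27017,"
    "node-us.example.com:27017,"
    "node-ap.example.com:27017/"
    "?replicaSet=rs0&readPreference=nearest"
)

# Writes go to the current primary; reads can be served by the
# nearest replica, so a regional outage need not take the service down.
orders = client.shop.orders
orders.insert_one({"basket_id": 42, "item": "example-item"})
print(orders.find_one({"basket_id": 42}))
```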

I am happy to agree that the NoSQL revolution made it possible to move the workloads of those old monolithic databases into the cloud. But it did so by throwing a rather important baby out with the bathwater: NoSQL databases can’t do transactions. Breaking the application into smaller chunks with a microservices approach was a brilliant move; each of those services has a database that serves it, and as a result you end up with a lot of NoSQL databases.

Why is that a problem? When I buy something or move money from A to B, it has to work, at both ends, and at the same time. If I put something in my online shopping basket and it gets lost, that’s a bit inconvenient, but not a real disaster. But when I tell the bank to pay for that order to be delivered, it had better work and balance my account instantly. That ability was taken for granted in transactional systems until the cloud revolution started; then we had to write enormous amounts of extra code, thousands of lines of it, with all the attendant complexity, maintenance, upgrades and cost, where ten years ago one line of SQL could do the job perfectly for you.
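
To make the contrast concrete, here is a minimal sketch using Python’s built-in sqlite3 module of the kind of atomic transfer described above: the debit and the credit either both commit or both roll back. The table and account names are invented for illustration:

```python
# Minimal sketch of an atomic money transfer using Python's built-in
# sqlite3 module. Table and account names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bank", 0)])

def transfer(conn, src, dst, amount):
    # The 'with' block wraps both updates in one transaction:
    # they commit together, or roll back together if anything fails.
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                     (amount, src))
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                     (amount, dst))

transfer(conn, "alice", "bank", 30)
print(conn.execute("SELECT * FROM accounts ORDER BY name").fetchall())
# [('alice', 70), ('bank', 30)]
```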

Truly mainframe-level transactions in the cloud

The call to action here is that as your organisation ramps up to do post-COVID business at digital speed, there will be more and more use cases that you can’t serve with a virtualised, microservices-and-NoSQL approach alone. At the same time, we can’t go back to the big old Oracle and IBM solutions, with all that compute and air conditioning humming away in the corner.

To be a serious ecommerce player, you can’t program your way out of this: hand-rolling that transaction logic carries huge cost, and with it greater risk and greater delay. The solution has to be a low-code, cloud-based, distributed, transactional data approach that supplies the one piece of the puzzle NoSQL hasn’t been able to deliver: simple, reliable, instant, scale-out transactional support.
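
As a sketch rather than a prescription: distributed SQL databases of this kind (YugabyteDB, for example, exposes a PostgreSQL-compatible interface) let the same short transaction run against a cluster spread across regions. The example below assumes the psycopg2 driver and a reachable cluster; the endpoint, credentials and table are hypothetical:

```python
# Hedged sketch: the same transactional transfer, but against a
# PostgreSQL-compatible distributed SQL cluster instead of a single box.
# Assumes psycopg2 is installed and a cluster is reachable at the
# (hypothetical) host below; the table and columns are invented.
import psycopg2

conn = psycopg2.connect(
    host="db.example.com", port=5433,  # placeholder endpoint
    dbname="shop", user="app", password="secret",
)

with conn:                      # one atomic transaction across the cluster
    with conn.cursor() as cur:
        cur.execute(
            "UPDATE accounts SET balance = balance - %s WHERE name = %s",
            (30, "alice"),
        )
        cur.execute(
            "UPDATE accounts SET balance = balance + %s WHERE name = %s",
            (30, "bank"),
        )

conn.close()
```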

So I am sorry to say that, for all your hard work during COVID ramping up your brand’s digital transformation, completing the promise of the cloud means going to the next level. But it’s a step you really need to take post-COVID, to get the same power and reliability the relational database gave you, in the operating system of today: the cloud.

Written by Martin Gaffney, vice-president EMEA at Yugabyte
