On the road to containerisation? It’s time your database shifted gear
Tue 17 Mar 2020 | Anil Kumar
Organisations must ditch legacy databases to take DevOps to the next level
It’s no secret that DevOps teams typically have three main priorities: increasing agility and innovation, improving collaboration, and delivering products faster to market. It’s also no secret that they will be quick to adopt any technology that supports these goals.
Containerisation is a natural fit for this framework: it builds on the scalability and dynamism of the cloud, letting teams develop and update applications faster, meet the ever-increasing demands placed on DevOps teams, and ultimately deliver better customer experiences. However, a few obstacles still stand in the way.
A game of catching up
Containerisation – often heralded as the next evolution of the cloud – continues the trend of offering new ways for DevOps teams not only to perform their core work, but also to deliver the applications they build.
But it often finds itself moving too fast for other technologies to keep up. For instance, while developers have embraced containers to make it easier to create cloud-native applications, the databases powering these applications haven’t always followed suit.
Legacy databases, while perfectly suited to their original tasks, have traditionally been tied to a single bare-metal or virtual machine instance. They simply weren't designed to support applications in a containerised, highly distributed, instantly scalable cloud.
Some businesses will of course insist on using legacy databases to support containerised infrastructure and development, and it’s these organisations that will nearly always encounter three key problems:
- Ever-increasing operational costs – manually deploying and managing hundreds of database clusters across multiple geographies incurs high costs, effort, and complexity
- Vendor lock-in – a lack of standardisation to ensure data can be moved freely and safely between cloud providers has made it difficult to switch providers quickly or work with multiple providers
- Delayed time to market – customers with applications using microservices architecture have difficulties managing and scaling database clusters in siloed systems, extending development times and making it harder to support their applications
Time to go cloud-native
Most organisations are now working out how to move some, if not all, of their core applications to the cloud, and solving these data challenges with containers is critical to the success of these projects.
As well as the immediate benefits for DevOps teams – such as being able to quickly create test environments to fine-tune new features and applications – cloud-native applications are simply more flexible and well-orchestrated.
The ability to create applications that can quickly migrate to more cost-effective infrastructure or to meet compliance demands; that can be recovered quickly in the event of a disaster; and that can scale at peak times, such as major events for travel and ticketing companies or big shopping days for retailers, is crucial for modern organisations to thrive.
Any business that depends on legacy technology that cannot support these changes is likely to stagnate and ultimately be left behind by more agile competitors. In order to survive, most organisations will have to embrace technology that can support cloud-native applications across the entire stack.
Mistaken beliefs – and how to tackle them
If a business has put off adopting technology that can support the next generation of the cloud, it’s normally down to a few different factors: their database vendor doesn’t support, endorse, or allow it; database administrators don’t trust it; the storage is volatile; it’s faster and more stable on physical or dedicated virtual nodes; and so on.
These are all common, and not irrational, objections to containerising databases, and they have led many organisations to stick with their legacy databases for longer than they needed to – after all, if the upgrade won't meet your needs either, what's the point in doing it?
Luckily for these organisations, there is now a way to support the next generation of cloud while letting containers operate in tandem with a modern database: container orchestration systems increasingly include the capability to run distributed databases in containers.
For example, DevOps teams can run their database as a fully managed stateful database application next to microservices applications on a single Kubernetes platform. Because Kubernetes makes it easier to manage and scale their database, it’s much faster and easier to put in place the right database capability to support new applications. This means they can concentrate on developing applications that will take full advantage of their database’s capabilities, safe in the knowledge that when the application is rolled out into full production, this capability will be there.
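To make this concrete, here is a minimal, hypothetical sketch of how a database cluster might be declared as a Kubernetes StatefulSet. All names, the image, the port, and the storage size are placeholders for illustration, not a vendor-specific setup:

```yaml
# Hypothetical example: a three-node database cluster run as a
# Kubernetes StatefulSet. Names, image, and sizes are placeholders.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: example-db
spec:
  serviceName: example-db        # headless Service giving each pod a stable network identity
  replicas: 3                    # three database nodes
  selector:
    matchLabels:
      app: example-db
  template:
    metadata:
      labels:
        app: example-db
    spec:
      containers:
        - name: db
          image: example/db:latest   # placeholder database image
          ports:
            - containerPort: 5432
          volumeMounts:
            - name: data
              mountPath: /var/lib/db
  volumeClaimTemplates:          # each pod gets its own persistent volume
    - metadata:
        name: data
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 10Gi
```

The key design point is the `volumeClaimTemplates` section: unlike a stateless Deployment, each database pod keeps its own persistent volume and stable identity across restarts and rescheduling, which is what makes stateful workloads viable on the platform.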
Not only this, but working through the same orchestration system as containers also means such databases can be more easily duplicated and distributed to support cloud-native applications, instead of being rooted to the spot. Organisations can also take advantage of pre-programmed responses that can be built into systems such as Kubernetes: for instance, a self-healing mechanism to recover from failures, or setting environments to automatically scale up and down when certain criteria are met.
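As a hedged illustration of those pre-programmed responses: self-healing comes largely from Kubernetes restarting containers that fail their liveness probes, while automatic scaling can be declared with a HorizontalPodAutoscaler. The sketch below shows the latter for a hypothetical application tier (all names and thresholds are placeholders):

```yaml
# Hypothetical example: scale an application tier between 2 and 10
# replicas when average CPU utilisation crosses 70%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: example-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: example-app          # placeholder workload to scale
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Once applied, the cluster adds or removes replicas on its own as load rises and falls, with no manual intervention from the DevOps team.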
Fit for the future
As the cloud revolution continues to gain speed, containerisation is possibly the biggest milestone of them all. At the same time, it lays bare an important point: revolutions will only benefit the business if the entire organisation is brought along.
In the world of IT, this means not only rushing to adopt containers and cloud-native applications but ensuring that critical infrastructure such as databases can be brought along for the ride. Organisations that can adapt, stay ahead of the curve, and effectively future-proof their business, will be at a huge advantage: not least because their DevOps teams no longer need to spend time firefighting or doing their jobs with one hand tied behind their backs, but can concentrate on improving the business instead.
Tags: containers, database, DevOps