Establishing a strong DevOps pipeline

Having a strong DevOps pipeline is increasingly important for businesses creating software in-house. What can CTOs do to ensure a steady flow?

DevOps aims to shorten the systems development lifecycle and provide continuous delivery of high-quality software. Information Age looks at how organisations can build a strong DevOps pipeline for the benefit of the business.

A strong DevOps pipeline

Subhash Ramachandran, senior vice president of product management at enterprise software firm Software AG, says software development is now a priority for most organisations.

“DevOps, and more recently DataOps, has pushed software development to the front of most corporate IT roadmaps. The rise of DevOps has been well explained as a new approach to make monolithic applications more agile and responsive to market and workforce changes,” says Ramachandran.

“There are so many different patterns in data integration – from batch, to streaming and beyond – that a patchwork landscape of technologies has led to huge fragmentation. Data engineers without the right tools end up stressing about constantly pivoting to keep things in sight and steady, which is a drag on resources,” Ramachandran adds.

“Therefore, DevOps is only useful if businesses can interpret and take action on the data. Organisations must have a strong pipeline in place to manage the incoming data and this is where application integration comes in. Having a powerful integration platform can automatically manage the DevOps data pipeline to provide better visibility and insights, real-time engagement with customers, and frictionless partner and supplier transactions.”

Frictionless DevOps pipeline

“In too many organisations DevOps teams have to choose between development speed and the quality of their code. To create a strong DevOps pipeline, IT leaders need to remove this friction through automation,” says Greg Adams, regional VP for the UK and Ireland at Dynatrace. “Augmenting the skills of software engineers with AIOps [artificial intelligence operations] reduces the need for them to manually conduct routine, highly repeatable tasks in the delivery pipeline.”

This enables DevOps teams to focus on writing code rather than conducting checks and balances, so they can accelerate the delivery pipeline while still maintaining quality.

“We’re seeing this being prioritised, with rising investment in automated CI/CD [continuous integration/continuous deployment] pipelines, shift-left security, and AIOps-driven root-cause analysis, helping to boost developers’ confidence in the quality of their software releases,” Adams says.
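As a rough illustration of the kind of automated quality gate Adams describes, the Python sketch below blocks a release when a service's error rate regresses. The monitoring endpoint, service name and threshold are hypothetical placeholders, not any specific vendor's API.

```python
# A minimal sketch of an automated release quality gate, assuming a
# hypothetical monitoring endpoint that reports a service's error rate.
import json
import sys
import urllib.request

METRICS_URL = "https://monitoring.example.com/api/error-rate"  # hypothetical
ERROR_RATE_THRESHOLD = 0.01  # fail the pipeline above 1% errors


def fetch_error_rate(service: str) -> float:
    """Query the monitoring system for the service's current error rate."""
    with urllib.request.urlopen(f"{METRICS_URL}?service={service}") as resp:
        return float(json.load(resp)["error_rate"])


def main() -> None:
    rate = fetch_error_rate("checkout")  # hypothetical service name
    if rate > ERROR_RATE_THRESHOLD:
        # A non-zero exit code halts the CI/CD pipeline automatically,
        # so no engineer has to eyeball dashboards before each release.
        print(f"Quality gate failed: error rate {rate:.2%} exceeds threshold")
        sys.exit(1)
    print(f"Quality gate passed: error rate {rate:.2%}")


if __name__ == "__main__":
    main()
```

Run as a pipeline step, a gate like this turns a manual pre-release check into one that fails fast and consistently on every build.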

Infinite cycle

Mariusz Tomczyk, senior DevOps engineer at STX Next, says a successful DevOps pipeline should be seen as a process.

Tomczyk says: “There are no strict rules but most of the time the perfect pipeline is an infinite cycle of the following stages: plan, code, build, test, release, deploy, operate and monitor – it is a process.”
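As a loose illustration of that cycle expressed as pipeline-as-code, the Python sketch below chains the eight stages Tomczyk lists; the stage bodies are placeholders rather than real tooling.

```python
# A minimal sketch of the plan-to-monitor cycle as pipeline-as-code.
# Each stage function is a stub; real pipelines delegate these to a CI
# server, an orchestrator and a monitoring stack.
from typing import Callable


def plan():    print("plan: groom backlog, prepare roadmap")
def code():    print("code: commit changes to the shared repository")
def build():   print("build: compile and package artefacts")
def test():    print("test: run automated test suites")
def release(): print("release: tag and publish a version")
def deploy():  print("deploy: roll out to the target environment")
def operate(): print("operate: run the service")
def monitor(): print("monitor: collect metrics that feed the next plan")


STAGES: list[Callable[[], None]] = [
    plan, code, build, test, release, deploy, operate, monitor,
]


def run_one_iteration() -> None:
    # The cycle is infinite in practice: each monitor stage informs
    # the next plan stage, closing the loop.
    for stage in STAGES:
        stage()


run_one_iteration()
```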

But how can this process be made strong?

“It’s important to first establish what ‘strong’ means. The answer might be a pipeline that can integrate changes developed by multiple developers into a central, shared code repository as often as possible. Whatever you do, establishing a good plan is the first thing to do.”

A good plan shapes the entire workflow before developers start coding and move onto the next stages of the process.

“Gather user stories, split the work into small pieces that can be easily handled by the team and prepare a roadmap,” says Tomczyk. “This will guide the team along the process, making delivery of the features easier.”

Compliance

Harbinder Kang, global head of developer operations at Finastra, says regulatory regimes must be considered in the pipeline process.

Kang says: “For businesses operating in regulated industries, such as financial services, organisations must ensure their applications are compliant and secure, as well as reliable. Ensuring the right tools, technology and quality checks are in place is essential, but this requires intervention and monitoring from domain specialists, such as site reliability engineers [SREs] and regulatory and compliance specialists.”

He adds: “With a platform model, quality gates and regulatory frameworks can be built in, providing service teams with the confidence that applications will be compliant and meet the required quality standards.

“Standardised tools and technologies, consumable via APIs, eliminate the need for specialist engineers to manage and maintain their own technologies and tools, as well as the need for developers to fully understand them, as the platform’s in-built guardrails will prevent non-compliant apps from reaching the customer environment.”
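A minimal sketch of such a guardrail might look like the following Python check; the manifest format and policy names are hypothetical, purely to show how a built-in gate can stop a non-compliant app before it reaches the customer environment.

```python
# A minimal sketch of a platform guardrail, assuming a hypothetical
# deployment manifest format. The policy names are illustrative only.
import sys

REQUIRED_POLICIES = {
    "image_signed": True,        # only signed container images
    "encryption_at_rest": True,  # data stores must be encrypted
    "pii_logging": False,        # no personal data in application logs
}


def check_manifest(manifest: dict) -> list[str]:
    """Return a list of policy violations for a deployment manifest."""
    violations = []
    for policy, required in REQUIRED_POLICIES.items():
        if manifest.get(policy) != required:
            violations.append(f"{policy} must be {required}")
    return violations


# Example manifest a service team might submit to the platform.
manifest = {"image_signed": True, "encryption_at_rest": False, "pii_logging": False}

violations = check_manifest(manifest)
if violations:
    # The guardrail blocks the deployment before it reaches the
    # customer environment, as Kang describes.
    print("Deployment blocked:", "; ".join(violations))
    sys.exit(1)
print("Deployment compliant")
```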

Data backup

Michael Cade, senior global technologist at backup provider Veeam, says data must be protected in the process. He says: “A strong DevOps pipeline is continuously deploying code at scale. But, increasingly, we are seeing a closer tie between code and data, in particular, code being deployed that can affect and change data.

“A strong modern DevOps pipeline needs to automate backup policies to mitigate the risk of losing or modifying data with each version change. This can be done by incorporating backup actions into your pipeline to ensure that any code changes are secure – this can allow even the largest teams to automate safely at scale.”
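As a rough sketch of that pattern, the Python example below takes a backup before each deployment; trigger_backup() and deploy() are hypothetical stubs standing in for a real backup product's API and a real deployment step.

```python
# A minimal sketch of a pre-deployment backup hook. Both functions are
# stubs: a real pipeline would call a backup vendor's API or CLI and a
# real deployment tool here.
import sys
import time


def trigger_backup(database: str) -> str:
    """Request a backup and return a restore point ID (stubbed here)."""
    restore_point = f"{database}-{int(time.time())}"
    print(f"backup created: {restore_point}")
    return restore_point


def deploy(version: str) -> bool:
    """Apply the code change; return False on failure (stubbed here)."""
    print(f"deploying {version}")
    return True


def main() -> None:
    # Take a backup before every version change, so a bad deployment
    # that modifies data can be rolled back to a known-good restore point.
    restore_point = trigger_backup("orders-db")  # hypothetical database name
    if not deploy("v2.4.1"):
        print(f"deployment failed; restore from {restore_point}")
        sys.exit(1)
    print("deployment succeeded")


main()
```

Because the backup runs automatically on every version change, the safety net scales with the team rather than depending on anyone remembering to take it.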

Databases

Ales Zeman, senior manager for pre-sales and professional services for EMEA at enterprise solutions provider Quest Software, says databases must be considered for the pipeline too.

“DevOps can accelerate every stage of the software development lifecycle, but DevOps is more than just the deployment of tools; it is about organisations embracing practices and building a culture around agility across all IT teams. The practice depends on collaboration, automated processes and technology working in harmony, but often falls short due to poor data visibility and control within the business,” says Zeman.

“As data becomes an increasingly strategic asset, enterprises turn to data operations to maximise the business value of the data and its underlying infrastructure.”

Data plays an important role in successful DevOps adoption, and the overall approach to designing, building, moving and using your data, both on-premises and in the cloud, is key to a company’s digital transformation.

“Therefore, organisations that apply DevOps practices like continuous integration and continuous deployment to react more quickly to changes in the business, must weave database development into the fabric of any DevOps plans,” Zeman says, “in order that it integrates with CI/CD automated processes, otherwise a significant bottleneck could be created that would delay application delivery.”
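A minimal sketch of weaving database changes into that fabric might look like the Python pipeline step below, which applies pending schema migrations and fails the build if they do not apply cleanly; Alembic is used here only as an example migration tool, and the command should be substituted for whatever your project uses.

```python
# A minimal sketch of running schema migrations as an automated CI/CD
# stage, assuming the project uses a migration tool with a CLI
# (Alembic's "upgrade head" is shown as an example).
import subprocess
import sys


def run_migrations() -> None:
    """Apply pending database migrations as an automated pipeline step."""
    result = subprocess.run(
        ["alembic", "upgrade", "head"],  # bring the schema to the latest revision
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Failing fast here keeps a broken schema change from becoming
        # the delivery bottleneck Zeman warns about.
        print("Migration failed:\n", result.stderr)
        sys.exit(1)
    print("Schema up to date")


run_migrations()
```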
