Maximise the value of your data whilst reducing cost and complexity.
Many companies work in silos, with business stakeholders divorced from the data engineers and analysts. As a result, organisations do not use their data effectively to achieve the best business outcomes.
DataOps is a new approach to delivering data, based on approaches that have proved successful in software delivery.
DataOps is more than a set of best practices; each pillar within DataOps is accompanied by tooling and methodology that blends lessons learned in the software world with solutions appropriate for the more challenging world of data.
At Kinaesis, we specialise in applying DataOps in the financial sector to achieve compliance, cost savings and revenue growth. Our approach to DataOps reflects the specific needs and challenges of our sector, and our own experience as long-time practitioners and subject-matter experts.
Truly maximising the value of DataOps requires more than knowledge of DevOps and tooling; it requires deep familiarity with data management, with analytics, and with the sector's specific needs. That is what informs our interpretation of the pillars of DataOps:
Design: Above all, make sure the data pipeline is driven by a business vision, not by the data that just happens to be available. Map out user journeys and visions, and drive technical change from that.
Instrumentation: Instrument the data flow every step of the way, using profiling, DQ, and monitoring to create a clear view of how reliable and timely data really is. Always present data quality information along with actual data.
Global metadata: Maintain clear business definitions and models, keep them connected to your data, and above all, keep them up to date.
Collaborative Analytics: Make sure that data consumers can collaborate with each other, and make sure that as a group they can collaborate with IT and data owners.
Governance: Meet quality, attestation and audit targets by applying proper version control and proper release management. Channel the output of instrumentation into strong exception handling and DQ processes.
Extensible platforms: Data needs to be exploited, and the business will always come up with new demands. Open standards, IT components that interoperate well, and clear contracts and operating models are essential to ensure that new demands can always be met.
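As a minimal illustration of the Instrumentation and Governance pillars above, the sketch below profiles a batch of records, keeps the quality summary attached to the data it describes, and channels rule violations into an exception queue. The field names and rules are hypothetical examples, not Kinaesis tooling.

```python
# Illustrative sketch: profile a batch of records, present the
# data-quality summary alongside the data itself, and route records
# that fail validation rules into an exception queue for remediation.
# Field names ("trade_id", "notional") are hypothetical.

def validate(record, required_fields):
    """Return the names of required fields that are missing or empty."""
    return [f for f in required_fields if record.get(f) in (None, "")]

def instrument(records, required_fields):
    """Split a batch into clean records and exceptions, and attach a
    simple data-quality profile to the result."""
    clean, exceptions = [], []
    for record in records:
        errors = validate(record, required_fields)
        if errors:
            exceptions.append({"record": record, "missing": errors})
        else:
            clean.append(record)
    total = len(records)
    profile = {
        "row_count": total,
        "exception_count": len(exceptions),
        "completeness": 1 - len(exceptions) / total if total else 1.0,
    }
    # Quality information travels with the data, never separately.
    return {"data": clean, "exceptions": exceptions, "quality": profile}

batch = instrument(
    [
        {"trade_id": "T1", "notional": 1_000_000},
        {"trade_id": "T2", "notional": None},
    ],
    required_fields=["trade_id", "notional"],
)
```

Downstream consumers receive `batch["data"]` together with `batch["quality"]`, so reliability is always visible at the point of use, and `batch["exceptions"]` feeds the remediation process.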
Back in 2008, a group of tried-and-tested software development practices, together with new tooling that made environment and release management easier than before, was bundled together as DevOps. DevOps is the principal inspiration behind DataOps, and from DevOps comes a concern for version control, release management, environment management, and documentation.
From Agile, DataOps takes an emphasis on collaboration, quick release cycles, and iterative refinement.
The focus on instrumentation comes from Lean; Lean methodologies hold that fully instrumenting a supply chain is necessary in order to optimise it. Our DataOps vision takes this a little further: not just the supply chain itself but the data flowing through it needs to be instrumented.
Finally, from User Experience (UX) development comes a toolkit for generating vision and requirements through interaction and exploration with the business - essential if DataOps is to be more than just optimisation.