Want to enhance your own DataOps knowledge? Want to learn how to use it to drive change across all departments? We mix our knowledge and consultancy experience to empower you. DataOps is a relatively new concept, but learning it and the potential it holds will give you a competitive edge.
We are launching our DataOps Courses, tailored for businesses looking to increase the real knowledge available to decision makers, deliver the user experience needed to drive value, and address sources of latency, risk and complexity in their data and analytics pipelines.
Our DataOps training will help you in the following areas:
• Regulatory Compliance: Address regulatory compliance for BCBS 239, IFRS 9, CCAR, SFTR, GDPR, Dodd-Frank, MiFID II, Basel III / CRD, Solvency II and AIFMD.
• Optimised IT Estate: Migrate EUDAs, adopt a data-centric SDLC, automate manual solutions, orchestrate your data pipeline so you can decommission legacy systems, and follow a pragmatic, achievable path for Cloud/Hybrid migration.
• Reporting and Analytics Delivery: Solve the data issues blocking the delivery of Reporting, Analytics, AI and Machine Learning solutions. Remove change bottlenecks by enabling federated delivery and iterative adoption.
• Enterprise Wide Data Aggregation: Build enterprise views of core data, such as a Single Customer View, without depending on slow-moving monolithic solutions. Maximise the value of your existing estate and enable a clear path to simplification and consistency.
• Data Governance and Control: Build pervasive data management and governance capabilities rather than ‘one-off’ fixes, through embedded, efficient and sustainable capability. Govern and control your data lakes whilst maintaining project agility, building governance and lineage into the project process and the architecture of the solution.
• Data Culture: Help employees understand how they can continually harness data to drive better decision-making and uncover untapped value.
2-hour workshop: walks you through the DataOps Methodology at a high level. Key takeaways: understand the six pillars of DataOps as a set of tools to measure your organisation’s maturity and plan for the future.
2-5 day course: complete with interactive exercises and case studies, the course is a definitive overview of everything you need to know about DataOps. You will learn the trade secrets, the pitfalls and, most importantly, how DataOps can benefit your own progression and your organisation as a whole. It runs as either an introductory (2 day) or advanced (5 day) course, depending on your level of maturity.
Provided by expert trainers with more than 60 years’ combined experience delivering data initiatives using our DataOps methodology. Not sure what DataOps is? Watch our video to understand why DataOps is growing in popularity:
If you are interested in talking more about how this can work for you, let me know and we can arrange a chat! To see more content and tasters, you can sign up for DataOps Course updates here.
For more information please click here.
Kinaesis have delivered our first DataOps Course for RiskCare, a financial services consultancy. Following on from our creation of the DataOps Thinktank, this new training course represents our latest contribution to the DataOps movement.
DataOps is a new, comprehensive methodology for managing data pipelines, ensuring compliance and data quality while delivering analytics with a quick time to market.
The Kinaesis DataOps Course is an engaging, pragmatic toolkit, breaking down DataOps tooling and processes into pillars that each solve a key data delivery challenge. Our course materials help enterprises use DataOps to gain maximum value from the information they hold, taking full advantage of modern analytics while satisfying regulators, creating a culture of collaboration, improving control, helping delivery and reducing risk.
Sign up for DataOps Course updates here. Subscribe to be the first to have updates as to when the course is available and sneak previews of what is included!
We were delighted to receive great and helpful feedback, and we would like to thank the team at RiskCare for their expert comments and support. We look forward to incorporating their feedback ahead of launching the course.
By Simon Trewin
What is Instrumentation all about? It is easiest to define through a question.
'Have you ever been on a project that has spent 3 months trying to implement a solution that is not possible due to the quality, availability, timeliness of data, or the capacity of your infrastructure?'
It is interesting, because when you start a project you don't know what you don't know. You are full of enthusiasm about a great vision, but you can easily overlook the obvious barriers.
An example of a failure at a bank comes to mind. Thankfully this was not my project, but it serves as a reminder when I follow the DataOps methodology that there is good reason for the discipline it brings.
In this instance, the vision for a new risk system was proposed. The goal: to reduce processing time so that front office risk was available at 7am. This would let traders know their positions and risk without relying on spreadsheets, bringing massive competitive advantage. Teams started looking at processes, technology and infrastructure to handle the calculations and the flow of data. A new front-end application was designed, new calculation engines and protocols were established, and the project team held conversations with data sources. But everything was not as it seemed. After speaking to one of the sources, it became clear that the data required to deliver accurate results would not be available until 10am, rendering the solution worthless.
The cost to the project included the disruption, budget and opportunity lost not only by the project team, but also by the stakeholders.
Instrumenting your pipeline is about:
• Establishing data dependencies up front in the project.
• Understanding the parameters of the challenge before choosing a tool to help you.
• Defining and discussing the constraints around a solution before UAT.
• Avoiding costly dead ends that take years to unwind.
Instrumenting your data pipeline means collecting information about the pipeline, either to support implementation of a change or to manage the operation of the pipeline and process.
Collecting information on how data gets from source into analytics is key to understanding the dependencies that exist between data sources. Writing this down in a single plan empowers you to pre-empt potential bottlenecks and hold-ups, and to establish the critical path.
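To make this concrete, here is a minimal sketch of the idea: walking a pipeline's dependency graph from source availability times to find the earliest time an output can actually be ready, as in the 7am risk story above. All feed names, times and durations are invented for illustration.

```python
# Sketch: mapping source availability through a pipeline's dependency
# graph to find the earliest time an output can be ready.
# All feed names, times and durations are hypothetical.

# When each upstream source delivers its data (hours, 24h clock)
source_ready = {"positions_feed": 4.0, "market_data": 5.5, "ref_data": 10.0}

# Each pipeline step: (inputs it depends on, processing time in hours)
pipeline = {
    "enrich":    (["positions_feed", "ref_data"], 0.5),
    "calc_risk": (["enrich", "market_data"], 1.0),
    "report":    (["calc_risk"], 0.25),
}

def earliest_ready(node):
    """Earliest time `node` can complete, walking the dependency graph."""
    if node in source_ready:
        return source_ready[node]
    inputs, duration = pipeline[node]
    return max(earliest_ready(i) for i in inputs) + duration

print(earliest_ready("report"))  # 11.75 -- far later than a 7am target
```

Even a back-of-an-envelope model like this would have exposed the 10am source dependency before any build work started.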
Data quality is key to the pipeline. Can you trust the information being sent through? In general, the answer is to code defensively; how defensive you need to be depends on how much accuracy your end system requires. Understanding the implications of incorrect data coming through is important: during operation it provides reassurance that the data flowing can be trusted, and during implementation it can determine whether the solution is viable and should commence.
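As a sketch of what "coding defensively" can look like in practice, the check below validates incoming records and reports every issue found rather than failing silently. The field names, record shape and thresholds are illustrative assumptions, not part of any specific system.

```python
# Sketch of defensive data-quality checks on incoming records.
# Field names and accepted values are illustrative assumptions.

def validate(record):
    """Return a list of quality issues; an empty list means the record is trusted."""
    issues = []
    if record.get("trade_id") is None:
        issues.append("missing trade_id")
    notional = record.get("notional")
    if not isinstance(notional, (int, float)):
        issues.append("notional is not numeric")
    elif notional < 0:
        issues.append("negative notional")
    if record.get("currency") not in {"GBP", "USD", "EUR"}:
        issues.append("unrecognised currency")
    return issues

good = {"trade_id": "T1", "notional": 1_000_000, "currency": "GBP"}
bad = {"trade_id": None, "notional": "n/a", "currency": "XXX"}
print(validate(good))  # []
print(validate(bad))   # three issues flagged
```

Collecting issues as data, rather than raising on the first failure, lets the same checks drive both implementation-time viability analysis and day-to-day operational monitoring.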
The types and varieties of data have implications for your pipeline. Underneath these types there are different ways of transporting the data, and within each transport there are different formats that require different processing. Understanding this complexity is important because it has a large impact on processing requirements. We have found that on some projects translating data from one format to another consumed nearly 60% of the processing power; fixing this made the project viable and able to deliver.
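You do not need elaborate tooling to surface this kind of cost: timing a single translation step is often enough. The sketch below measures a JSON serialise/parse round-trip as a stand-in for any format translation; the record shape and count are invented for illustration.

```python
# Sketch: timing a format-translation step to see how much of the
# pipeline's processing budget it consumes. Record shape and count
# are illustrative.
import json
import time

records = [{"id": i, "px": i * 1.5} for i in range(50_000)]

start = time.perf_counter()
as_json = [json.dumps(r) for r in records]   # serialise...
back = [json.loads(s) for s in as_json]      # ...and parse back
elapsed = time.perf_counter() - start

print(f"translation round-trip took {elapsed:.3f}s for {len(back)} records")
```

Comparing numbers like this across candidate formats is how a translation eating 60% of the processing power gets spotted before it ships.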
Understanding the volume along the data pipeline is key to understanding how you can optimise it. The Kinaesis DataOps methodology enables you to document this and then use it to establish a workable design. A good target operating model and platform enables you to manage this proactively, avoiding production support issues, maintenance and rework.
Velocity or throughput
Coupled with the volume of data is its velocity. The interesting thing about velocity is that, multiplied by volume, it generates pressure; understanding these pressure points enables you to navigate to a successful outcome. When implementing a project, you need this answer at the beginning of the analysis phase to establish your infrastructure requirements. For daily operational use, it is important to capacity-manage the system and predict future demand.
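The volume-times-velocity idea can be sketched as a simple table of stages: multiply message size by arrival rate at each stage and the pressure points stand out. The stages, sizes and rates below are invented figures for illustration.

```python
# Sketch: estimating pressure points by multiplying message volume
# (size) by velocity (arrival rate) at each pipeline stage.
# All figures are invented for illustration.

stages = {
    # stage: (avg message size in KB, messages per second)
    "ingest":    (2.0, 5000),
    "enrich":    (6.0, 5000),  # enrichment triples the payload
    "aggregate": (6.0, 200),   # aggregation collapses the rate
}

for stage, (size_kb, rate) in stages.items():
    throughput_mb_s = size_kb * rate / 1024  # pressure on this stage
    print(f"{stage}: {throughput_mb_s:.1f} MB/s")
```

Here the enrichment stage carries roughly three times the ingest load, which is exactly the kind of pressure point you want documented before sizing infrastructure.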
The final instrument is value. All data pipeline implementations require some cost-benefit analysis. Many times in my career I have had to persuade stakeholders of the value of delivering a use case. It is key to rank the value of your data items against the cost of implementation. Sometimes, when people understand the eventual cost, they lower the priority of the requirement in the first place. This is essential in moving to a collaborative analytics process, which is key to your delivery effectiveness.
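Ranking value against cost can be as simple as a sorted ratio. The sketch below orders candidate data items by value per unit of implementation cost; the item names, scores and costs are hypothetical examples, not benchmarks.

```python
# Sketch: ranking candidate data items by business value relative to
# implementation cost. Names, scores and costs are hypothetical.

candidates = [
    # (data item, business value score, implementation cost score)
    ("single customer view", 90, 60),
    ("intraday positions", 70, 20),
    ("legacy feed rebuild", 30, 50),
]

ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
for name, value, cost in ranked:
    print(f"{name}: value/cost = {value / cost:.2f}")
```

Putting the ratios in front of stakeholders makes the conversation concrete: a high-value item can still fall down the list once its true cost is on the table.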
Instrumentation is as important in a well-governed data pipeline as it is in any other production line or engineering process. Compare it to a factory producing baked beans: the production line has many points of quality measurement to make sure the product being shipped is trusted and reliable, otherwise the beans will not reach the customer in a satisfactory state. Learning to instrument your data pipelines and projects reduces risk, improves efficiency and builds the capacity to deliver trusted solutions.
Sign up for updates for our DataOps Course here.