Our client needed to reduce the time to market for new analytical capabilities. Traditional data warehouses took too long to implement and lacked the scale to meet their biggest analytical challenges. They needed new capabilities to analyse disparate sources of data and turn them into insight and knowledge.
They had created a data lake using big data technologies, and they needed to manage its data in a strongly governed and effective way. Their objective was to create a data lifecycle that enabled fast analytical investigation alongside fast, controlled production roll-out to the broader user base.
Kinaesis worked with our client to design and implement a unified versioning and release framework covering data sourcing, data models and data content. As a result, data-sourcing code (ETL) was always released alongside versioned data structures and data onto a scalable Hadoop infrastructure, ensuring alignment and consistency.
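The coupling described above can be illustrated with a minimal sketch. All names here (`ReleaseManifest`, the version tags) are hypothetical, not the client's actual framework; the point is simply that ETL code, data structures and data content are promoted together as one versioned unit.

```python
from dataclasses import dataclass

# Hypothetical sketch: a release manifest couples the data-sourcing (ETL)
# code version with the data-structure (schema) and data-content versions,
# so the three are released together and cannot drift apart.
@dataclass(frozen=True)
class ReleaseManifest:
    release_id: str
    etl_code_version: str   # version tag of the ETL / data-sourcing code
    schema_version: str     # version of the data structures it populates
    data_version: str       # version of the data content being loaded

    def is_aligned(self, deployed_schema: str) -> bool:
        """A release may only be promoted against the schema it was built for."""
        return self.schema_version == deployed_schema

# Example: one release pairs ETL v3.2 with schema v5 and data snapshot 117.
r = ReleaseManifest("2024.1", "etl-3.2", "schema-5", "data-117")
print(r.is_aligned("schema-5"))  # True: ETL and deployed schema match
print(r.is_aligned("schema-4"))  # False: mismatch blocks the promotion
```

In practice such a manifest would gate the deployment pipeline, so that a mismatched schema or data version fails the release rather than reaching production.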
We helped the client to define a data environment with an embedded governance framework, supporting the analytics development lifecycle from the data scientists' sandbox through to the production analytics and reporting system.
We worked with our client to design and implement the processing of diverse data sources into structured production data targets, containing consistently described data for production analysis and reporting.
The client was able to build consensus around a data and analytics strategy that uses big data to solve industry-wide challenges and to help their clients achieve efficiencies through the optimisation of their operations.