Enterprise data platform modernization
The client’s legacy system was handling three billion transactions a day, with over 10 petabytes of stored data, plus 4,500 reports and dashboards for 27 business units. But the system was showing its limitations, and many of the 1,200+ technical users had to work around them.
A complex architecture had multiple failure points requiring frequent manual intervention, resulting in average system availability of just 85%. Most data was stored in older formats, with multiple copies, expensive license commitments and data security challenges. Slow, expensive and cumbersome technology meant low end-user satisfaction, slow business reporting and, at best, next-day reporting.
Rearchitecting the system significantly reduced complexity: a Databricks Lakehouse solution, with a unified Delta Lake on AWS, enabled 9,000+ ingestion jobs to be consolidated into a single one, while data availability increased to 99%. Performance gains were significant, with a 10x increase in data mart processing speed and an 80% reduction in both new data ingestion time and production incidents.
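The consolidation of thousands of ingestion jobs into one typically relies on a metadata-driven pattern: a single generic job reads a configuration table and loops over sources, so onboarding a new feed means adding a config row rather than building a new pipeline. A minimal sketch of that pattern, with all source names, paths and table names purely illustrative (the real job would use Spark reads and Delta Lake writes rather than the placeholder function shown here):

```python
from dataclasses import dataclass

@dataclass
class SourceConfig:
    """One row of the ingestion config table (names are hypothetical)."""
    name: str
    path: str
    fmt: str
    target_table: str

# Illustrative registry standing in for the per-source jobs being replaced.
SOURCES = [
    SourceConfig("orders", "s3://raw/orders/", "csv", "bronze.orders"),
    SourceConfig("payments", "s3://raw/payments/", "json", "bronze.payments"),
]

def ingest(source: SourceConfig) -> str:
    # In a real Lakehouse job this would be a parameterized read of
    # source.path followed by a write to the Delta table; here we return
    # a description of the work so the sketch stays self-contained.
    return f"loaded {source.path} ({source.fmt}) -> {source.target_table}"

def run_all(sources: list[SourceConfig]) -> list[str]:
    # One generic job replaces N bespoke ones: new feeds are config rows.
    return [ingest(s) for s in sources]
```

The key design choice is that pipeline logic lives in one place, so fixes and performance improvements apply to every feed at once.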
There is now role-based access control, with private accounts for Finance and HR. Costs have been cut through reduced data duplication, eliminated license commitments, a lighter ongoing support burden and lower development costs.
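Role-based access control attaches permissions to roles rather than to individual users: a user acquires access to a table only through the roles they hold. A minimal sketch of the idea, with all role, user and table names invented for illustration (a Databricks deployment would express this through the platform's own grants, not application code):

```python
# Grants attach to roles, not users (all names here are hypothetical).
ROLE_GRANTS = {
    "finance_analyst": {"finance.ledger", "finance.payroll_summary"},
    "hr_partner": {"hr.employees"},
}

# Users acquire access only via role membership.
USER_ROLES = {
    "alice": {"finance_analyst"},
    "bob": {"hr_partner"},
}

def can_read(user: str, table: str) -> bool:
    """True if any of the user's roles has been granted the table."""
    return any(
        table in ROLE_GRANTS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

Keeping finance and HR grants on separate roles is what lets their data live in the same platform without either team seeing the other's tables.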
End-to-end delivery, from analysis and design to delivery and support.
Data Experts
Specialist Data Domains
Data Project Delivery Success
Defining a vision for data use and how to get there.
Helping you make sense of your numbers and complex data.
Ingesting, moving and processing data in the most effective way.
Enabling you to tell meaningful stories through data exploration (including automation, dashboard design and graph modelling).
Providing an effective blueprint for your data environment, using technical design to support business strategy, design/deployment, and delivering modern approaches like Fabric.