
The Challenge

The client’s legacy system was handling three billion transactions a day, with over 10 petabytes of stored data, plus 4,500 reports and dashboards serving 27 business units. But the system was showing its limitations, and many of the 1,200+ technical users had to work around them.

The complex architecture had multiple failure points that required frequent manual intervention, resulting in an average system availability of just 85%. Most data was stored in older formats, with multiple copies, expensive license commitments and data security challenges. Slow, expensive and cumbersome technology led to low end-user satisfaction and slow business reporting, with next-day reporting at best.

The Solution

Rearchitecting the system significantly reduced complexity. The Databricks Lakehouse solution, built on a unified Delta Lake on AWS, enabled 9,000+ ingestion jobs to be consolidated into a single job, while data availability increased to 99%. Performance gains were significant: a 10x increase in datamart processing speed, and an 80% reduction in both new data ingestion time and production incidents.
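Consolidating thousands of per-source jobs into one typically relies on a metadata-driven pattern: a single parameterized driver iterates over a control table of source definitions. The sketch below illustrates that pattern only; the config fields, paths and table names are hypothetical, not details of the client engagement, and the actual Databricks implementation is not public.

```python
from dataclasses import dataclass

# Hypothetical control-table entry describing one source to ingest.
@dataclass
class SourceConfig:
    source_name: str
    source_path: str
    target_table: str
    load_mode: str  # "full" or "incremental"

def ingest(cfg: SourceConfig) -> str:
    # Placeholder for the actual ingestion step (e.g. reading the source
    # and writing to a Delta table); here we just report what would run.
    return f"{cfg.load_mode} load: {cfg.source_path} -> {cfg.target_table}"

def run_all(configs: list[SourceConfig]) -> list[str]:
    # One driver job replaces thousands of per-source jobs: behavior
    # lives in configuration rows, not in separate job definitions.
    return [ingest(c) for c in configs]

configs = [
    SourceConfig("orders", "s3://raw/orders", "bronze.orders", "incremental"),
    SourceConfig("customers", "s3://raw/customers", "bronze.customers", "full"),
]
print(run_all(configs))
```

Adding a new source then means adding a configuration row rather than building and maintaining another job.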

There is now role-based access control, with private accounts for finance and HR. Costs have been cut through reduced data duplication, eliminated licenses, a lighter ongoing support burden and lower development costs.
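In role-based access control, permissions are granted to roles rather than individual users, so sensitive scopes such as finance and HR stay restricted to the roles that need them. A minimal sketch of the lookup, with illustrative role and scope names only (the client's actual roles and grants are not described in this case study):

```python
# Hypothetical grant table: each role maps to the data scopes it may read.
GRANTS = {
    "finance_analyst": {"finance", "shared"},
    "hr_partner": {"hr", "shared"},
    "general_analyst": {"shared"},
}

def can_read(role: str, scope: str) -> bool:
    # Unknown roles get no access by default.
    return scope in GRANTS.get(role, set())

print(can_read("finance_analyst", "finance"))  # True
print(can_read("general_analyst", "hr"))       # False
```

Access decisions become a table lookup, so granting or revoking access is a data change rather than a code change.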

The Results

10x increase in datamart performance
80% reduction in production incidents
50% reduction in execution time
80% reduction in new data ingestion time
Time to process 1 billion records reduced from 16-24 hours to 2 hours
20% reduction in development costs

Our Process

End-to-end delivery, from analysis and design to delivery and support.

Innovation and design: We’ve pioneered, designed and developed data quality, logging and alerting frameworks, providing input based on lessons learned from previous architecture projects.

Development: We’ve developed key architectural features, including the CI/CD pipeline, a Terraform-based framework deployment process, and the data quality, logging and alerting frameworks.
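A data quality framework of the kind mentioned above is commonly built as a set of named rules evaluated against incoming rows, with failure counts feeding the logging and alerting layers. This is a minimal sketch of that idea, assuming simple row-level rules; the real frameworks developed for the client are not public, and the rule names here are illustrative.

```python
from typing import Callable

# A rule takes one row (as a dict) and returns True if the row passes.
Rule = Callable[[dict], bool]

def check(rows: list[dict], rules: dict[str, Rule]) -> dict[str, int]:
    # Count failures per rule; a real framework would also log each
    # violation and raise alerts when thresholds are exceeded.
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                failures[name] += 1
    return failures

rows = [{"id": 1, "amount": 10.0}, {"id": None, "amount": -5.0}]
rules = {
    "id_not_null": lambda r: r["id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}
print(check(rows, rules))  # {'id_not_null': 1, 'amount_non_negative': 1}
```

Keeping rules declarative means new checks can be added without touching the evaluation engine.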

Ingestion: We’ve helped analyze incoming data for PII, configured the ingestion job to handle data ingestion for 10,000+ tables, and validated the data for completeness.
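Completeness validation at this scale is often done by reconciling row counts between source and target for every ingested table. A minimal sketch under that assumption (the reconciliation method and table names are illustrative, not taken from the engagement):

```python
def find_incomplete(source_counts: dict[str, int],
                    target_counts: dict[str, int]) -> list[str]:
    # Return tables whose ingested row count does not match the source;
    # a missing target table counts as zero rows ingested.
    mismatched = []
    for table, expected in source_counts.items():
        if target_counts.get(table, 0) != expected:
            mismatched.append(table)
    return mismatched

source = {"orders": 1_000_000, "customers": 50_000, "products": 2_500}
target = {"orders": 1_000_000, "customers": 49_998, "products": 2_500}
print(find_incomplete(source, target))  # ['customers']
```

Running such a reconciliation after each load gives an automatic signal for re-ingestion or investigation across all 10,000+ tables.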

DataMart: We’ve supported the migration and validation of 2,800+ jobs and 3,600+ datamart tables across 27 business units.

Support: Our data integrity team has monitored the health of ingestion jobs and datamarts.

About Data

1,300+ Data Experts
4 Specialist Data Domains
20+ Years of Data Project Delivery Success
