VOO - BI and Big Data Transformation with a migration to the Cloud

VOO - one of the largest telecom operators on the Belgian market - has launched Memento. What is it about? It's VOO's Business Intelligence and Big Data transformation program, built around a key migration to the Cloud. Find out how we assisted the operator through the different phases of the project.

Who is VOO?

VOO is a Belgian telecom operator, active mainly in the Walloon and Brussels regions. Providing its customers with cable TV, telephone and Internet services, VOO is both customer and technology oriented.

Thanks to VOO's innovative products and services, you and your family can enjoy super-fast broadband Internet, a wealth of TV content and generous mobile and fixed-line plans. It is this Belgian operator that Lucy's experts assisted through its digital transformation.

Challenges

As part of a global transformation, we implemented a complete migration of VOO's Business Intelligence, Big Data and AI landscape to the Cloud. This migration was critical to address strategic and pressing business needs such as:
 
  • Dramatically increase customer knowledge to accelerate acquisition and improve loyalty and retention.
  • Support the digital transformation by providing a unified view of the customer and their behavior.
  • Respond to new compliance challenges (GDPR).
  • Drastically reduce the total cost of ownership of the data environments (4 different BI environments and 3 Hadoop clusters before the transformation).
  • Introduce enterprise-wide data governance and solve the shadow-BI problem (25+ FTEs on the business side processing data).

The solution

Lucy's experts conducted a quick study, analyzing all aspects of the transformation and addressing both the organizational challenge (roles and responsibilities, teams and skills, processes, governance) and the technical challenge (holistic architectural scenarios, ranging from hybrid cloud to full cloud PaaS solutions).

Based on the results of the study, we deployed a cloud-based, enterprise-wide data platform. It combines traditional BI processes with advanced analytics capabilities. We redefined the data organization and associated processes and introduced enterprise-level data governance.

Total cost of ownership has dropped to less than 30% of what it was before. Agility and capabilities have improved dramatically. 

A cloud-based, enterprise-wide data platform powered by AWS

Architecture based on AWS key data services

(Architecture diagram: A2C reference architecture for VOO)
Data Lake

Amazon S3 is used for the core entry layer and to provide long-term persistence.

Some data files are pre-processed on Amazon EMR. EMR clusters are created on the fly several times a day and only process the new data that has arrived in S3. Once the data is processed and persisted in Apache Parquet, a format optimized for analytics, the cluster is destroyed. Encryption and lifecycle management are enabled on most S3 buckets to meet security and cost-efficiency requirements, and all data remains durably stored in the data lake. Amazon Athena is used to create and maintain a data catalog and to explore the raw data in the Data Lake.
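
As an illustration, such a transient cluster can be launched programmatically, for example with boto3. The sketch below is a minimal example under assumed names: the bucket, job script, instance types and region are hypothetical, not VOO's actual configuration. The key point is that KeepJobFlowAliveWhenNoSteps=False makes EMR terminate the cluster as soon as its Spark step completes.

```python
import boto3

# Minimal sketch of the "transient cluster" pattern: create an EMR cluster,
# run one Spark step over the newly arrived S3 data, then let EMR destroy it.
# All names, paths and sizes below are hypothetical.
emr = boto3.client("emr", region_name="eu-west-1")

response = emr.run_job_flow(
    Name="daily-parquet-conversion",
    ReleaseLabel="emr-6.10.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        # The cluster is destroyed as soon as the last step finishes.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    Steps=[
        {
            "Name": "convert-new-files-to-parquet",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://example-bucket/jobs/to_parquet.py"],
            },
        }
    ],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Started transient cluster:", response["JobFlowId"])
```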

Data Warehouse

The data warehouse runs on Amazon Redshift, using the new RA3 nodes, and follows the Data Vault 2.0 methodology. Data Vault objects follow strict, highly standardized modeling rules, which enables a high degree of automation: the data model is generated from metadata stored in an Amazon Aurora (RDS) database.
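
To give a feel for this metadata-driven approach, here is a minimal sketch of how the DDL for a Data Vault hub could be generated from metadata rows. The template follows common Data Vault 2.0 conventions (hash key, load timestamp, record source); the entity and column names are illustrative, not VOO's actual model.

```python
# Minimal sketch of metadata-driven Data Vault DDL generation.
# In the real platform the rows would come from the Aurora metadata store;
# here they are hard-coded for illustration.
HUB_TEMPLATE = """CREATE TABLE IF NOT EXISTS dv.hub_{entity} (
    {entity}_hk      CHAR(32)     NOT NULL,  -- hash of the business key
    {business_key}   VARCHAR(255) NOT NULL,  -- the business key itself
    load_dts         TIMESTAMP    NOT NULL,  -- load timestamp
    record_source    VARCHAR(64)  NOT NULL,  -- originating system
    PRIMARY KEY ({entity}_hk)
);"""

def generate_hub_ddl(metadata_rows):
    """Render one CREATE TABLE statement per hub described in the metadata."""
    return [HUB_TEMPLATE.format(**row) for row in metadata_rows]

if __name__ == "__main__":
    rows = [{"entity": "customer", "business_key": "customer_number"}]
    print("\n\n".join(generate_hub_ddl(rows)))
```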

The automation engine itself is built on AWS Step Functions and AWS Lambda.
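
As a sketch of what such an engine can look like, the snippet below registers a Step Functions state machine that chains three hypothetical Lambda load functions (hubs, then links, then satellites). The function ARNs, account ID and IAM role are placeholders, not VOO's actual resources.

```python
import json
import boto3

sfn = boto3.client("stepfunctions", region_name="eu-west-1")

# Hypothetical orchestration: load hubs, then links, then satellites,
# each step implemented as a Lambda function.
definition = {
    "StartAt": "LoadHubs",
    "States": {
        "LoadHubs": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-hubs",
            "Next": "LoadLinks",
        },
        "LoadLinks": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-links",
            "Next": "LoadSatellites",
        },
        "LoadSatellites": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-satellites",
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="data-vault-load",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/sfn-execution-role",
)
```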

DynamoDB

Amazon DynamoDB is used for specific use cases where web applications require sub-second response times. Using DynamoDB's variable read/write capacity allows the more expensive high-performance read capacity to be provisioned only during business hours when low latency and fast response time are required. These mechanisms, which rely on the elasticity of AWS services, are used to optimize the monthly AWS bill.
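
One common way to implement such a business-hours schedule is with Application Auto Scaling scheduled actions, as in the sketch below. The table name, capacity figures and cron expressions are illustrative only.

```python
import boto3

aas = boto3.client("application-autoscaling", region_name="eu-west-1")

TABLE = "table/customer-profile"  # hypothetical table name
DIMENSION = "dynamodb:table:ReadCapacityUnits"

# Make the table's read capacity scalable between a floor and a ceiling.
aas.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId=TABLE,
    ScalableDimension=DIMENSION,
    MinCapacity=5,
    MaxCapacity=1000,
)

# Raise the floor at the start of the business day (07:00 UTC, Mon-Fri)...
aas.put_scheduled_action(
    ServiceNamespace="dynamodb",
    ScheduledActionName="scale-up-business-hours",
    ResourceId=TABLE,
    ScalableDimension=DIMENSION,
    Schedule="cron(0 7 ? * MON-FRI *)",
    ScalableTargetAction={"MinCapacity": 500, "MaxCapacity": 1000},
)

# ...and lower it again in the evening, when low latency is no longer needed.
aas.put_scheduled_action(
    ServiceNamespace="dynamodb",
    ScheduledActionName="scale-down-off-hours",
    ResourceId=TABLE,
    ScalableDimension=DIMENSION,
    Schedule="cron(0 19 ? * MON-FRI *)",
    ScalableTargetAction={"MinCapacity": 5, "MaxCapacity": 50},
)
```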

Machine Learning

A series of predictive models were implemented, ranging from a classic churn prediction model to more advanced use cases. Amazon SageMaker was used to build, train, and deploy the models at scale, leveraging the data available in the Data Lake (Amazon S3) and Data Warehouse (Amazon Redshift).
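
For illustration, a churn-style training job along these lines can be expressed with the SageMaker Python SDK and its built-in XGBoost container, as sketched below. The S3 paths, IAM role, instance types and hyperparameters are placeholders rather than the models actually deployed at VOO.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/sagemaker-execution-role"  # placeholder

# Built-in XGBoost container: a common baseline for churn prediction.
image_uri = sagemaker.image_uris.retrieve(
    "xgboost", session.boto_region_name, version="1.5-1"
)

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://example-bucket/churn/models/",  # placeholder path
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=200)

# Training data exported from the Data Lake / Warehouse to S3 (path is illustrative).
estimator.fit({"train": TrainingInput("s3://example-bucket/churn/train/",
                                      content_type="text/csv")})

# Deploy the trained model behind a real-time inference endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```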

And much more!

The data platform we've built offers dozens of other capabilities, and the broad set of services available in the AWS environment allows us to address new use cases quickly and efficiently, every day.

Mentions in the press

Want to learn more about this disruptive project successfully carried out by Micropole? Read the press coverage!
