Background in Data Modeling

In 2010, already familiar with Data Vault 1.0 from project work, I came across Dan Linstedt's work online and quickly became inspired by this way of modeling. At the time, I was working as a developer supporting a Data Vault project implemented in Oracle. Fast forward to 2018: now in Germany, I came across a boot camp offering in-depth training on Data Vault 2.0, along with the chance to sit an exam for certification. After the boot camp, I took some time to prepare and became a certified Data Vault 2.0 Practitioner. Over the following years, I expanded my expertise, working on Data Vault projects, consulting on POCs, and repeatedly serving as a data developer. I also gained experience in data engineering, Python data engineering, and data migration projects.

Between February 2022 and September 2024, I had the opportunity to work on a data modeling project that was part of a global initiative at the Group Head Office of a large insurance organization in Germany. The goal was ambitious: streamline reporting for portfolio steering across 10+ operating entities (OEs). These entities were insurance organizations within the group, each based in a different country, with unique business practices and reporting KPIs. The challenge: enabling group reporting for portfolio steering within the Group Data Office.


Understanding the Client's Challenges

The first hurdle was the sheer diversity of insurance business practices across OEs. Each region had its own ways of working and measuring success, which made it challenging to unify the data into a single, cohesive model. Adding to this was the need to adhere to the Data Vault 2.0 methodology and align with an overarching Enterprise Ontology, ensuring that every piece of data conformed to a consistent, standardized framework. All of this had to be anchored in a Group Business Glossary (GBG) that defined and standardized critical business data fields.

Leveraging Past Experiences

Thankfully, I came prepared. My Data Vault 2.0 experience helped me quickly grasp the company's Group Enterprise Ontology, which spanned multiple insurance domains. Enhancing the Group Data Model with new scope involved workshops and interviews with business subject matter experts (SMEs). These sessions were crucial for understanding the nuances of their business requirements, data needs, and reporting goals, from the terminologies they used to their operational practices. I translated these requirements into clear models, integrated them into the GBG, and incorporated them into the final deliverables: the data model and the data standards.

One of my most impactful contributions was authoring a comprehensive Data Mapping Guideline. This document became a go-to resource for OEs, explaining the objectives, data granularity expectations, and relationships required for alignment. It also included practical examples to make the guidelines easier to follow, which significantly reduced misunderstandings during implementation. Additionally, I ran several OE onboarding sessions, giving representatives a forum to ask questions, share feedback, and raise the challenges they faced while implementing the data model and data standards.


Unveiling Surprises Along the Way

No project goes without surprises. One memorable challenge involved how OEs managed address data. Instead of maintaining a unified view, the OEs had addresses scattered across systems: policyholder addresses in policy management systems, claim incident addresses in claims management systems, and invoicing addresses managed separately again. When we proposed consolidating these into a single entity with unique identifiers, some OEs pushed back, arguing that it was impractical, especially in multilingual environments. In reality, they simply found it difficult to extract and unify the addresses.

To address their concerns, I proposed a future-ready solution for the next version of the model: enriching address data with latitude and longitude to serve as unique identifiers. While this approach required a long-term commitment, it eased the immediate concerns and allowed us to move forward.
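A minimal sketch of the idea, assuming a Data Vault 2.0-style hash key derived from geocoordinates (the function name, delimiter, and precision are illustrative assumptions, not the project's actual implementation): rounding coordinates to a fixed precision makes the same physical location resolve to the same identifier regardless of which source system, or language, the address came from.

```python
import hashlib

def address_hash_key(latitude: float, longitude: float, precision: int = 5) -> str:
    """Derive a Data Vault-style hash key for a consolidated address entity
    from geocoordinates. Rounding to ~5 decimal places (roughly metre-level)
    keeps the key stable across source systems. Illustrative sketch only."""
    # Normalize the business key: fixed decimal precision, delimiter-joined,
    # as is common for multi-part business keys in Data Vault 2.0.
    business_key = (
        f"{round(latitude, precision):.{precision}f}"
        f"||{round(longitude, precision):.{precision}f}"
    )
    return hashlib.sha256(business_key.encode("utf-8")).hexdigest()

# The same location reported by two systems yields one identifier,
# even if one system carries extra decimal noise.
policy_addr = address_hash_key(48.13743, 11.57549)
claims_addr = address_hash_key(48.1374301, 11.5754898)
assert policy_addr == claims_addr
```

The design choice here is that the identifier is computed, not assigned, so no OE has to coordinate with any other to produce a matching key.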

Another challenge came from the varying data infrastructures across OEs. While some used modern platforms like Snowflake, others relied on legacy systems, including mainframes. This disparity meant that key metrics like Gross Written Premium (GWP) were often inconsistently available. In some cases, proxies had to be used, requiring additional validation to ensure accuracy.

Navigating Diverse Views

The differences in how business groups viewed data added another layer of complexity. For example, the Group Data Office, where I worked, prioritized an underwriting-year perspective (UWY), while others focused on accounting-year metrics (ACY). Aligning these views required yet another set of sessions: detailed work, worked examples, comparisons, and explanations that gradually built consensus. The sessions were time-intensive, but rewarding.

Integrating older 3NF models into the Data Vault framework also proved tricky. Some of these models were XML-based, with relationships that didn’t map cleanly into the new ontology. Here, my role extended beyond technical implementation—I had to guide the team through the intricacies of reconciling these differences while maintaining the integrity of the unified model.
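As a simplified illustration of the kind of restructuring involved (the element and field names are hypothetical, not taken from the actual models): in a legacy XML structure, a relationship often exists only implicitly through nesting, whereas Data Vault wants it surfaced as an explicit link between hubs.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of a legacy XML-based model: the policy-to-coverage
# relationship is expressed only through element nesting.
LEGACY_XML = """
<policy id="P-100">
  <coverage id="C-1"/>
  <coverage id="C-2"/>
</policy>
"""

def extract_policy_coverage_links(xml_text: str) -> list[dict]:
    """Surface the nested relationship as explicit link records, analogous
    to a Data Vault link table relating a policy hub to a coverage hub."""
    root = ET.fromstring(xml_text)
    policy_key = root.attrib["id"]
    return [
        {"policy_bk": policy_key, "coverage_bk": cov.attrib["id"]}
        for cov in root.findall("coverage")
    ]

links = extract_policy_coverage_links(LEGACY_XML)
assert links == [
    {"policy_bk": "P-100", "coverage_bk": "C-1"},
    {"policy_bk": "P-100", "coverage_bk": "C-2"},
]
```

Once relationships are explicit records like these, they can be loaded into link tables without carrying the source model's hierarchy into the target ontology.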


Collaborating for Success

As the project grew, additional internal team members were brought on board. While this expanded capacity, it also introduced challenges, as these team members were more familiar with 3NF modeling than Data Vault. This required significant patience and collaboration on my part. I created detailed examples, documentation, and mapping validations to bring everyone up to speed and ensure consistency across the team. Despite these hurdles, the collaborative effort paid off in the end.

Delivering Value to the Client

By the conclusion of the project, we delivered a robust unified data model that:

1. Standardized Data Reporting: Consolidated data from diverse sources into a cohesive framework, making group reporting more reliable.

2. Enhanced Data Quality: Addressed discrepancies through deduplication and clustering, improving decision-making capabilities.

3. Improved Future-Readiness: Designed to adapt to evolving business needs and advanced analytics use cases.

4. Facilitated Compliance: Ensured alignment with GDPR and internal governance standards through clear definitions and controlled access.
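The deduplication and clustering mentioned above can be sketched minimally as follows (field names and the normalization rules are illustrative assumptions, not the project's pipeline): normalize the fields that identify a record, then group records whose normalized forms collide.

```python
from collections import defaultdict

def normalize(record: dict) -> tuple:
    """Collapse casing, extra whitespace, and punctuation so near-duplicate
    entries compare equal. Field names are hypothetical."""
    def clean(s: str) -> str:
        return " ".join(s.lower().replace(".", "").split())
    return (clean(record["name"]), clean(record["city"]))

def cluster_duplicates(records: list[dict]) -> list[list[dict]]:
    """Group records whose normalized keys match; each group is one
    candidate duplicate cluster for review or merging."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[normalize(rec)].append(rec)
    return list(clusters.values())

records = [
    {"name": "ACME Insurance", "city": "Munich"},
    {"name": "acme  insurance", "city": "munich"},
    {"name": "Beta Re", "city": "Cologne"},
]
assert len(cluster_duplicates(records)) == 2
```

In practice a step like this would sit upstream of loading, so that only one representative key per cluster enters the unified model.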

Looking to the Future

This project reinforced my belief in the power of thoughtful data modeling to overcome even the most complex challenges. It also highlighted the importance of adaptability and effective communication when working with diverse stakeholders. Looking ahead, I’m excited about the possibility of tackling projects that:

- Advance ontology modeling and leverage knowledge graphs for richer insights.

- Solve cross-border data challenges for global organizations.

- Integrate seamlessly with AI systems to enable predictive analytics and automation.

Reflecting on this journey, I’m incredibly proud of the solutions we built and the collaborative spirit that drove the project’s success. If you’re grappling with similar challenges or seeking to future-proof your data architecture, let’s connect and create something extraordinary together.


Muhammad Moiz Ahmed

Project Data Manager, Data Vault 2.0 Modeler