The insurance industry sits on mountains of data, but most carriers struggle to turn that information into competitive advantage. Louis DiModugno, Global Chief Data Officer at Verisk, recently shared insights on how the industry can move from data chaos to clarity through better governance, AI integration, and strategic partnerships.
Drawing from his extensive experience on both sides of the insurance ecosystem, DiModugno reveals how data governance has evolved beyond traditional frameworks to become a strategic differentiator. His perspective offers a roadmap for insurance professionals navigating the complex landscape of modern data management.
Listen to the full podcast episode here
The New Reality of Insurance Data Governance
Data governance in insurance has transformed from a compliance checkbox into a strategic imperative. DiModugno’s approach at Verisk goes “beyond the standard DAMA framework that’s been out there for years” by implementing what he calls “data observability.”
This comprehensive approach encompasses four critical dimensions:
- Data quality monitoring across all ingestion points
- Data flow visibility to understand information movement
- Infrastructure governance in partnership with technology teams
- Usage governance to ensure contractual compliance with data partners
The stakes couldn’t be higher. As DiModugno explains, “The better my data is, the more confidence I have in any model or product that I have as an output from it.” In an industry where accuracy directly impacts profitability and regulatory compliance, this connection between data quality and business outcomes is fundamental.
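To make the first of those dimensions concrete, here is a minimal sketch of what an ingestion-point quality check might look like; the field names, rules, and thresholds are illustrative, not drawn from Verisk's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    failed: int

    @property
    def pass_rate(self) -> float:
        # Share of records in the batch that cleared every rule.
        return 1.0 - self.failed / self.total if self.total else 0.0

def check_batch(records: list[dict]) -> QualityReport:
    """Score one ingestion batch against two simple validation rules."""
    def is_bad(r: dict) -> bool:
        premium = r.get("premium")
        missing_id = not r.get("policy_id")
        bad_premium = not isinstance(premium, (int, float)) or premium < 0
        return missing_id or bad_premium
    return QualityReport(len(records), sum(1 for r in records if is_bad(r)))

batch = [
    {"policy_id": "P-001", "premium": 1200.0},
    {"policy_id": "", "premium": 950.0},      # fails: missing policy ID
    {"policy_id": "P-003", "premium": -50},   # fails: negative premium
]
print(f"pass rate: {check_batch(batch).pass_rate:.1%}")  # pass rate: 33.3%
```

Tracking a pass rate like this at every ingestion point is what turns "data quality monitoring" from a slogan into a number a team can watch and improve.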
Breaking Down Silos: The Integration Challenge
One of the most persistent challenges in insurance data management remains the historical separation between underwriting and claims data. These silos developed over decades of legacy system implementations, but they create significant blind spots for carriers trying to understand their full risk exposure.
DiModugno emphasizes that successful data integration depends on entity resolution capabilities. “The challenge with any bringing together of data sets is how are you going to match them?” he notes. While 70-80% match rates might have been acceptable years ago, modern expectations demand rates above 90%.
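As an illustration of the matching problem, here is a minimal entity-resolution sketch using Python's standard-library difflib; a production matcher would add normalization, blocking, and many more attributes, and the records here are invented.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two lightly normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_entities(record: dict, candidates: list[dict], threshold: float = 0.9):
    """Return candidates whose name AND address both clear the threshold,
    scored by the weaker of the two attributes."""
    matches = []
    for c in candidates:
        score = min(similarity(record["name"], c["name"]),
                    similarity(record["address"], c["address"]))
        if score >= threshold:
            matches.append((c, score))
    return sorted(matches, key=lambda m: m[1], reverse=True)

claims_record = {"name": "Acme Widget Co.", "address": "12 Main St, Hartford CT"}
policy_records = [
    {"name": "ACME Widget Co", "address": "12 Main Street, Hartford CT"},
    {"name": "Apex Widgets", "address": "98 Elm St, Boston MA"},
]
print(match_entities(claims_record, policy_records))  # only the first candidate clears 0.9
```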
The mathematical reality is sobering. When combining multiple data sources, confidence levels multiply rather than add. Three data sets, each with 90% confidence, result in an overall confidence of just 72.9% (0.9 × 0.9 × 0.9). This multiplicative effect underscores why achieving the highest possible data quality at each source becomes critical.
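The arithmetic is easy to verify, and it also shows the payoff of raising quality at each source (the 97% figure below is a hypothetical, not one from the conversation):

```python
import math

per_source = [0.90, 0.90, 0.90]                 # confidence in each matched source
print(f"combined: {math.prod(per_source):.1%}")  # combined: 72.9%

improved = [0.97, 0.97, 0.97]                    # raise quality at every source
print(f"combined: {math.prod(improved):.1%}")    # combined: 91.3%
```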
AI and Generative AI: Amplifying Both Success and Failure
The integration of AI and generative AI into insurance operations presents a double-edged opportunity. These technologies can process data faster and at greater scale than ever before, but they also amplify existing data quality issues.
DiModugno’s team has found success using retrieval-augmented generation (RAG) architectures for internal operations. For example, when legal questions arise about carrier contracts, their AI system can quickly search through documentation and return answers with specific reference points. This approach has “gathered some significant efficiency opportunities” while maintaining transparency about information sources.
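The pattern is straightforward to sketch. The retrieval step below uses TF-IDF purely for illustration (production RAG systems typically pair an embedding model with a vector store), the contract clauses are invented, and the hand-off to a generation model is left as a comment:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Corpus of contract clauses, each tagged with a reference point (invented).
clauses = [
    ("Contract A, §4.2", "Carrier data may be used for internal model validation only."),
    ("Contract B, §7.1", "Aggregated outputs may be shared with third parties after anonymization."),
    ("Contract A, §9.3", "Data must be purged within 90 days of contract termination."),
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(text for _, text in clauses)

def retrieve(question: str, k: int = 2):
    """Return the top-k clauses most similar to the question, with references."""
    scores = cosine_similarity(vectorizer.transform([question]), matrix)[0]
    ranked = sorted(zip(scores, clauses), reverse=True)[:k]
    return [(ref, text) for _, (ref, text) in ranked]

for ref, text in retrieve("How long can we retain carrier data after termination?"):
    print(ref, "->", text)
# The retrieved clauses and their references would then be passed to the
# generation model, so every answer carries a citation back to its source.
```

Keeping the reference point attached to each retrieved passage is what preserves the transparency DiModugno describes: the answer arrives with a pointer to the clause it came from.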
The key insight is that AI doesn’t eliminate the need for data governance; it makes governance more critical. Poor-quality data fed into AI systems produces poor-quality outputs at scale. As DiModugno puts it, the confidence he has “in the output of those models” depends entirely on the underlying data quality.
Related content: Douglas Loots on how AI and third-party data are transforming insurance
The Standards Battle: Moving Toward Consistency
One of the advantages Verisk enjoys as a data aggregator is its established relationships with policy administration system providers like Guidewire, EIS, and Duck Creek. These partnerships help create consistency in data formats and quality, addressing what could otherwise become a “Tower of Babel” situation.
The alternative—receiving whatever format each carrier decides to send—leads to significant data quality challenges. DiModugno recalls experiences from the reinsurance space where data might arrive in unstructured formats or even handwritten on paper. The industry has come a long way, but standardization efforts remain crucial for scaling data operations effectively.
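The fix, conceptually, is a thin normalization layer that maps each source's format onto one canonical record before anything downstream sees it. The vendor schemas below are invented for illustration:

```python
# Hypothetical canonical policy record; real integrations map many more fields.
def from_vendor_a(raw: dict) -> dict:
    return {"policy_id": raw["PolicyNumber"],
            "effective": raw["EffDate"],
            "premium": float(raw["WrittenPremium"])}

def from_vendor_b(raw: dict) -> dict:
    return {"policy_id": raw["pol_id"],
            "effective": raw["eff_dt"],
            "premium": raw["premium_amt"] / 100}  # cents to dollars

NORMALIZERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalize(source: str, raw: dict) -> dict:
    """Route a raw payload through the right per-source mapper."""
    return NORMALIZERS[source](raw)

print(normalize("vendor_b", {"pol_id": "P-77", "eff_dt": "2025-01-01", "premium_amt": 120000}))
# {'policy_id': 'P-77', 'effective': '2025-01-01', 'premium': 1200.0}
```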
Data Observability: The Next Evolution
Traditional data governance focused primarily on policies, procedures, and compliance frameworks. DiModugno’s concept of data observability adds operational intelligence to these foundations:
Cost Management: Understanding the financial investment required for data quality initiatives and measuring returns on that investment.
Access Controls: Implementing granular controls that track who accesses data, what they access it for, and how frequently they use it.
Usage Compliance: Ensuring that data use aligns with contractual obligations, particularly important when handling data from multiple carrier partners.
Quality Monitoring: Continuously measuring and improving data quality across all systems and processes.
This holistic approach treats data as a strategic asset requiring active management rather than a byproduct of business operations.
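A minimal sketch of the access-control and usage-compliance dimensions might look like this: every read is logged with who, what, and why, and each stated purpose is checked against the contract terms on file. Dataset names and permitted purposes are invented:

```python
import datetime

# Hypothetical contract terms: which purposes each data set may be used for.
ALLOWED_PURPOSES = {
    "carrier_x_claims": {"actuarial_modeling", "fraud_detection"},
}
access_log = []

def read_dataset(dataset: str, user: str, purpose: str):
    """Gate access on contractual purpose and record the event either way."""
    allowed = purpose in ALLOWED_PURPOSES.get(dataset, set())
    access_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "dataset": dataset, "user": user,
        "purpose": purpose, "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{purpose!r} is not permitted for {dataset}")
    return f"<rows from {dataset}>"  # stand-in for the actual query

read_dataset("carrier_x_claims", "analyst_7", "fraud_detection")  # allowed, logged
try:
    read_dataset("carrier_x_claims", "analyst_7", "marketing")    # blocked, logged
except PermissionError as e:
    print(e)
```

The log answers the observability questions (who, what, how often) while the purpose check enforces the contractual side in the same step.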
The Regulatory Landscape and AI
The regulatory environment for insurance continues to evolve, with over 50 different jurisdictions in the United States alone creating compliance complexity. DiModugno sees generative AI as particularly valuable for navigating this landscape, enabling teams to quickly understand regulatory changes and differences between states.
The efficiency gains are significant. Instead of having legal professionals spend hours combing through regulatory documents, AI systems can provide answers with reference points in minutes. This capability becomes increasingly important as regulatory frameworks adapt to address AI usage in insurance operations.
Future-Proofing Through Advanced Technologies
Looking ahead, DiModugno anticipates several technological developments that will reshape insurance data management:
Enhanced Segmentation: Larger data volumes will enable more granular risk segmentation while maintaining statistical confidence. Segments that once lacked sufficient data points for reliable modeling can now be modeled on their own.
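The statistics behind that claim are simple: the standard error of a segment's average loss shrinks with the square root of the segment's size, so a slice of the book that was too thin at 200 policies becomes credible at 20,000. A quick illustration with invented numbers:

```python
import math

sigma = 5_000  # assumed std. dev. of per-policy losses (illustrative)
for n in (200, 2_000, 20_000):
    print(f"n={n:>6,}: std. error of mean loss = ${sigma / math.sqrt(n):,.0f}")
# n=   200: std. error of mean loss = $354
# n= 2,000: std. error of mean loss = $112
# n=20,000: std. error of mean loss = $35
```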
Graph and Vector Databases: These technologies will reveal relationships and connectivity across data fields that traditional relational databases cannot capture, providing new insights into risk patterns and customer behavior.
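The kind of connectivity query a graph database makes cheap can be sketched with a plain adjacency map: the breadth-first walk below finds every entity linked to a claim through shared addresses or phone numbers, a query that would require repeated self-joins in a relational schema. The records are invented:

```python
from collections import deque

# Edges: entities linked by shared attributes (invented data).
graph = {
    "claim_1": ["addr_12_main", "phone_555"],
    "addr_12_main": ["claim_1", "claim_4"],
    "phone_555": ["claim_1", "claimant_b"],
    "claim_4": ["addr_12_main"],
    "claimant_b": ["phone_555"],
}

def connected(start: str) -> set[str]:
    """Breadth-first search over the relationship graph."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(connected("claim_1"))
# {'claim_1', 'addr_12_main', 'phone_555', 'claim_4', 'claimant_b'}
```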
Quantum Computing Impact: While promising enhanced capabilities, quantum computing also threatens current encryption models, potentially requiring complete rethinking of data security approaches.
Related content: Building Enterprise-Scale Generative AI in Insurance
Practical Steps for Insurance Professionals
Based on DiModugno’s insights, insurance professionals should prioritize several key actions:
Invest in Entity Resolution: Focus on achieving match rates above 90% when combining data sources. This foundation enables everything else.
Implement Data Observability: Move beyond basic data governance to comprehensive monitoring of data flows, usage, and costs.
Partner Strategically: Work with technology vendors and data providers who understand insurance-specific requirements and maintain consistent standards.
Plan for AI Integration: Develop data quality standards that can support AI implementations, recognizing that AI amplifies both good and bad data characteristics.
Address Silos Systematically: Create technical and organizational approaches to break down barriers between underwriting, claims, and other data sources.
The Democratization Opportunity
Perhaps most importantly, DiModugno envisions data and AI democratizing access to insurance products. Many consumers lack access to beneficial products like life insurance and annuities simply because they don’t understand what’s available or can’t access appropriate guidance.
AI-powered education and recommendation systems could bridge this gap, helping consumers understand their options and make informed decisions. This democratization represents both a significant business opportunity and a chance to improve financial security for millions of people currently underserved by the industry.
Conclusion: From Compliance to Competitive Advantage
The transformation DiModugno describes moves data governance from a compliance necessity to a competitive differentiator. Organizations that master data observability, implement effective AI integration, and maintain high-quality data standards will gain significant advantages in pricing accuracy, operational efficiency, and customer service.
The insurance industry stands at an inflection point. Carriers can continue treating data as a necessary evil, or they can embrace it as a strategic asset. Those choosing the latter path, guided by principles like DiModugno outlines, will be best positioned for success in an increasingly data-driven marketplace.
The journey from data chaos to clarity requires investment, commitment, and strategic thinking. But for insurance professionals willing to make that commitment, the rewards—in terms of both business results and industry impact—are substantial.
Subscribe to the Unstructured Unlocked podcast to get the latest episodes on your favorite platform.