
Tami Pantzke on risk analysis for new technology and change management in insurance

March 25, 2025 | Artificial Intelligence, Data Analytics, Data Science, Digital Transformation, Insurance, Insurance Claims, Insurance Underwriting, Intelligent Document Processing, Machine Learning


The insurance industry is no stranger to complexity, especially when it comes to navigating the rapid pace of technological change. To remain competitive, companies must systematically evaluate new tools, plan implementations carefully, and adopt effective change management practices.

On a recent episode of the Unstructured Unlocked podcast, Tami Pantzke, a seasoned reinsurance executive with over 30 years of experience, shared invaluable insights on evaluating and implementing new technologies in insurance. Drawing on her time as Senior Vice President of Operations at Gallagher Re and her expertise in operations and technology transitions, Tami highlighted practical strategies that can help insurers manage change effectively while balancing risk and operational efficiency.

Here’s a closer look at the discussion and Tami’s approach to leading successful technology transformations.

Listen to the full podcast episode here: Unstructured Unlocked Podcast

 

Identifying the need for new technology

 

According to Tami, the process of integrating new technology typically begins with a clear understanding of the company’s growth plans and operational goals. Whether companies are scaling via new clients, mergers and acquisitions (M&A), or other strategies, technology often plays a critical role in enabling that growth.

“You start by looking at your five-year growth plan,” Tami explained. “Do we have the people, processes, and technology needed to scale further? If not, that sparks the conversation about whether changes are necessary.”

Cost-benefit analysis is another essential early step. Tami described evaluating several scenarios to determine the best path forward:

  1. Maintaining the legacy system: What are the risks and costs associated with staying as is? 
  2. Improving the current system: Can enhancements to the existing system meet future growth goals and regulatory requirements? 
  3. Adopting a new solution: What are the costs, benefits, and risks of migrating to a completely new system? 

This structured approach helps organizations assess their options and ensure decisions align with strategic objectives.
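
As a rough, purely illustrative sketch of that comparison (the figures and the risk-adjustment formula below are assumptions, not numbers from the episode), here is one way the three scenarios could be scored side by side:

```python
# Hypothetical five-year figures for the three scenarios (all numbers assumed)
scenarios = {
    "Maintain legacy system": {"cost_m": 4.0, "benefit_m": 5.0, "risk": 0.30},
    "Improve current system": {"cost_m": 6.5, "benefit_m": 10.0, "risk": 0.20},
    "Adopt new solution":     {"cost_m": 9.0, "benefit_m": 16.0, "risk": 0.35},
}

def risk_adjusted_value(option: dict) -> float:
    """Expected benefit discounted by the chance it is not realized, minus cost ($M)."""
    return option["benefit_m"] * (1 - option["risk"]) - option["cost_m"]

# Rank the options from most to least attractive
ranked = sorted(scenarios.items(), key=lambda kv: risk_adjusted_value(kv[1]), reverse=True)
for name, option in ranked:
    print(f"{name}: risk-adjusted value ${risk_adjusted_value(option):+.1f}M")
```

A real analysis would break cost and benefit into many more line items (licensing, migration, training, regulatory exposure), but the ranking mechanic stays the same.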

 

Balancing growth and resources with smart implementation

 

One prominent challenge lies in balancing operational resources during transitions. Implementing new tools may require reallocating full-time employees (FTEs) and adjusting staffing plans to ensure both immediate and long-term needs are met.

“For some technology, you may see it reduces FTE requirements,” Tami said. “But that doesn’t necessarily mean you release people. You have to think ahead and consider your growth plan. Maybe you’re a little heavy during the transition, but those resources will integrate back into operations once new clients or growth materialize.”

Tami emphasized the importance of proper staffing during dual-system entry periods (when both new and legacy systems are running simultaneously). “You don’t turn off your legacy system right away,” she noted. “You’re doing both entries until you are certain the transition to the new system is working as expected.”

 

The role of the 80/20 rule in system evaluation

 

When evaluating potential technology solutions, Tami encouraged organizations to use the 80/20 rule as a framework for decision-making. “You want to ensure any new system can handle at least 80% of the functions you need,” she explained. The focus then shifts to assessing the remaining 20% of functions, which often include the most complex or high-risk processes.

That 20% deserves special scrutiny. “Those are your most complicated, manual, and high-risk processes,” Tami said. “You need to understand whether the new system can handle that 20% better than your legacy solution or if additional risks are introduced.” This highlights the importance of gap analyses and process mapping to evaluate compatibility with existing workflows.
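
To make the 80/20 screen concrete, here is a minimal sketch (hypothetical function names and data, not a tool from the discussion) that computes a candidate system's functional coverage and flags the unsupported high-risk processes that warrant the closest scrutiny:

```python
from dataclasses import dataclass

@dataclass
class RequiredFunction:
    name: str
    high_risk: bool  # complex, manual, or high-risk process

def gap_analysis(required: list, supported: set) -> None:
    """Report functional coverage and list unsupported high-risk functions."""
    covered = [f for f in required if f.name in supported]
    missing = [f for f in required if f.name not in supported]
    coverage = 100 * len(covered) / len(required)

    print(f"Coverage: {coverage:.0f}% ({len(covered)}/{len(required)} functions)")
    if coverage < 80:
        print("Below the 80% threshold -- likely not viable as-is.")
    high_risk_gaps = [f.name for f in missing if f.high_risk]
    if high_risk_gaps:
        print("High-risk gaps needing scrutiny:", ", ".join(high_risk_gaps))

# Hypothetical inventory, for illustration only
required = [
    RequiredFunction("premium booking", high_risk=False),
    RequiredFunction("claims entry", high_risk=False),
    RequiredFunction("treaty reconciliation", high_risk=True),
    RequiredFunction("regulatory reporting", high_risk=True),
    RequiredFunction("client statements", high_risk=False),
]
supported = {"premium booking", "claims entry", "client statements", "regulatory reporting"}

gap_analysis(required, supported)
```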

Related Content: Harnessing AI for better data: The key to competitive advantage, a conversation with Peter Mansfield

 

Navigating the complexities of change management

 

Change management can make or break a technology transition. Tami drew from her experience to outline practical steps for ensuring adoption and minimizing resistance:

  1. Engage key stakeholders early 

  “It’s critical to involve managers and employees who will be using the system,” Tami shared. “If I see managers sending their B or C teams to demonstrations, that’s a red flag. You want your best people involved so they can give informed feedback and serve as champions for adoption.”

  2. Run pilot programs 

  Before full-scale rollout, pilot programs allow teams to test the solution, uncover any challenges, and refine processes. “We would include SMEs [subject-matter experts] and key end users in pilots to provide actionable feedback,” Tami said. This is also the stage to develop critical documentation and training materials to support broader implementation.

  3. Identify resistance and address it proactively 

  “Change is difficult,” Tami remarked. “Some employees are resistant and prefer the status quo. Early in the process, you need to identify where resistance might arise and plan accordingly.”

  4. Test dual-system capabilities 

  Running legacy and new systems in parallel allows teams to measure functional accuracy and efficiency. Tami explained how this dual-entry period provides a benchmark for performance metrics, such as task completion times, and helps build confidence before full adoption.
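
One simple way to use that dual-entry period as a benchmark (a sketch with made-up timings, not a method prescribed in the episode) is to compare average task completion times recorded in each system:

```python
from statistics import mean

# Hypothetical task completion times (minutes) logged during dual entry
timings = {
    "premium booking": {"legacy": [12, 14, 11, 13], "new": [9, 8, 10, 9]},
    "claims entry":    {"legacy": [20, 22, 19],     "new": [21, 23, 22]},
}

for task, systems in timings.items():
    legacy_avg, new_avg = mean(systems["legacy"]), mean(systems["new"])
    change = (new_avg - legacy_avg) / legacy_avg * 100
    verdict = "faster" if change < 0 else "slower"
    print(f"{task}: legacy {legacy_avg:.1f} min, new {new_avg:.1f} min "
          f"({abs(change):.0f}% {verdict})")
```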

 

Pitfalls of poor planning and lessons learned

 

The conversation also shed light on lessons learned from past implementations that didn’t go as planned. One notable failure occurred when a project team spent months developing a new system, only to discover it couldn’t connect to their legacy technology. “The project had to be abandoned,” Tami recalled. “This was a costly lesson in the importance of early testing and integration validation.”

Another example highlighted the dangers of limited stakeholder involvement. A solution designed with minimal team input resulted in a 90/10 outcome, where only 10% of functions worked for users. “Lack of communication and engagement across the organization significantly reduces your success rate,” Tami emphasized.

Related Content: Transforming insurance claims: insights from Ian Thompson, Strategic Advisor and Former Zurich Insurance Executive

 

How smaller-scale implementations can move faster

 

Not all technology implementations require large-scale transformation. Tami described how department-specific solutions and smaller proof-of-concept (POC) initiatives allow organizations to introduce new tools at a faster pace. However, even with smaller projects, certain practices remain essential:

  • Ask the right questions 

  “You can’t jump into something just because it’s exciting,” Tami said. “You need to clarify how the new solution aligns with existing workflows, overlaps with other tools, and impacts long-term strategy.”

  • Leverage user data to ensure success 

  Running user reports and measuring adoption rates can help identify gaps in rollout efforts. “We’ve seen situations where a new portal was launched with great enthusiasm, but usage data later revealed it wasn’t being adopted by clients,” Tami shared. Addressing these insights early avoids wasted resources; a simple version of this kind of check is sketched after the list.

  • Balance speed with due diligence 

  While smaller implementations allow for more agility, Tami cautioned that proper vetting and governance are still important. “You still need smart heads in the room asking tough questions,” she said.
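
Picking up the point above about leveraging user data, here is a minimal adoption check (assumed counts and an assumed 60% target, purely for illustration):

```python
# Hypothetical monthly portal usage: active clients vs. clients onboarded
usage = {
    "Jan": {"active": 12, "onboarded": 40},
    "Feb": {"active": 18, "onboarded": 45},
    "Mar": {"active": 22, "onboarded": 50},
}

TARGET_ADOPTION = 0.60  # assumed rollout goal

for month, counts in usage.items():
    rate = counts["active"] / counts["onboarded"]
    flag = "" if rate >= TARGET_ADOPTION else "  <-- below target, revisit rollout"
    print(f"{month}: adoption {rate:.0%}{flag}")
```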

 

Looking ahead at technology transitions in insurance

 

As technology continues to evolve, insurance companies must strike a balance between innovation and caution. Legacy systems often remain in place for five to seven years, creating challenges when integrating cutting-edge tools like AI. Organizational design, stakeholder engagement, and ongoing governance will play key roles in navigating this fast-changing landscape.

Tami concluded with a message of optimism tempered by practicality. “The pace of change is accelerating, but with the right planning and processes, companies can adapt successfully,” she said. “It’s about knowing where to invest and having the right people in place to see it through.”

To learn more and keep up with the latest trends in AI, data, and insurance, subscribe to the Unstructured Unlocked podcast on your favorite platform.

Subscribe to our LinkedIn newsletter.

Register for our upcoming webinar: Better Data, Better Decisions: How AI and Automation Are Reshaping Risk and Pricing, March 31st at 10:00 AM ET

Frequently asked questions

  • How do insurance companies measure the success of a technology transition beyond operational efficiency? Beyond efficiency, companies assess success through cost savings, return on investment (ROI), revenue impact, customer experience improvements, regulatory compliance, and employee productivity. Post-implementation reviews, user feedback, and data trend analysis help refine the technology strategy over time.

  • How do companies handle cybersecurity risks when integrating new technology? They mitigate risks through vendor assessments, penetration testing, compliance with data protection regulations, multi-factor authentication, encryption, and real-time threat monitoring. Employee training on cybersecurity best practices is also essential, as human error is a common vulnerability.

  • What happens if a new technology fails after implementation? Companies conduct root cause analyses to determine whether adjustments can be made or if a rollback is necessary. Contingency plans may include reverting to the legacy system temporarily or implementing phased fixes. Strong change management ensures minimal disruption to customer service and regulatory compliance.