
Unstructured Unlocked episode 20 with Brandi Corbello

Watch Christopher M. Wells, Ph.D., Indico VP of Research and Development, and Michelle Gouveia, VP at Sandbox Insurtech Ventures, in episode 20 of Unstructured Unlocked with guest Brandi Corbello, Indico SVP of Global Delivery.

Listen to the full podcast here: Unstructured Unlocked episode 20 with Brandi Corbello

 

Christopher Wells:

Hi, welcome to another episode of Unstructured Unlocked. I'm co-host Chris Wells, and today I'm joined by co-host Michelle Gouveia. Michelle, hi. Hello. And we are both pleased to be welcoming Brandi Corbello, who is SVP of Global Delivery at our very own Indico Data. Brandi, thanks for being here.

Brandi Corbello:

Yeah, I’m excited. Thanks for having me.

CW:

Yeah, we'll see how excited you are when we're done. <Laugh>. Let's give it a shot. Why don't you tell everyone a little bit about your background, how you ended up where you are, and what it is that you do now?

BC:

Yeah, definitely. So my background is in management consulting, really around the transformation space for large corporations and enterprises. From there I took on a transformation role at Cushman & Wakefield, where I led standing up their transformation office, which was really exciting and consisted of things like AI and machine learning, analytics, as well as org redesign and change management. And then I ended up at Indico Data; before that, I was actually a customer. So I was really excited to see the trifecta, if you will, of what comes together in transformation. Because there's consulting and implementation, there's being on the enterprise side, and then there are also products that enable all of those things. So for me it was about rounding myself out a little bit, and I believed in the product. What I do here is focus on our customer journey. So think about the point in time where you're deep in evaluation of our product, or a product, really thinking about a proof of concept and purchasing something like Indico; that's where I start. And then our professional services team, which helps with implementation, works with our partners on enablement and with our customers, and then customer success. So really, through your lifetime with Indico.

CW:

Great. And that sets up really nicely the topic for today, which is essentially all things metrics around solutions like ours, but also more generally, as we think about large companies like insurers onboarding new platforms and new processes, I'd love to hear what you've learned in doing all of that delivery. What is that customer journey when you're onboarding a new tool or platform?

BC:

Yeah. Taking my Indico hat off and putting my former customer hat on, it's really important to understand first what problem we're solving and whether it aligns to the strategic objectives of our organization, or are we just solving an acute problem? Because if that's the case, you might want to think about it a little more broadly. So I always encourage people to take a step back: what problem are we solving, and how does that line up with the strategic objectives of our enterprise? And then you say, okay, what exactly would happen if I solved that problem? What impact could it have if I was able to get rid of it? And then you start saying, okay, it looks like I really need something that can do X, Y, Z. Then you start going out to the market to see what technology is out there. And it's really important to define what success looks like when you're starting to evaluate those technologies, versus just scanning the market. What do I need? What are my actual requirements to solve this problem? And how am I going to evaluate that from a success-criteria perspective?

CW:

Can I pull a thread before we jump into the definition of success and how you measure things? I've bought a number of different software tools in a number of different contexts. Talk to me about going out to the market, because that's a tricky thing in B2B software buying.

BC:

That's a fair question. I think about my consulting hat as well as my customer hat, prior to being at Indico. A lot of times we hear there's this hot new technology out there, and it's like, oh, I must go see this. And then you're really trying to force a technology, versus really understanding what it is that I actually need. I don't know if that answers your question.

CW:

It does, at what I would say is the 30,000-foot view: depending on what it is you're trying to buy, build the right shopping list before you go to the grocery store. How do you even source the potential solutions and sift through them?

BC:

So a lot of times I went out to analyst reports to start. Gartner was one I always followed, or Everest; there are many out there. But I typically went to analysts to see, hey, what's out there, knowing what I needed. I think you always have to know what you need, because they have reports across a breadth of technologies and they're very specific, with categories. So usually I'd go to analysts, and typically I would talk to peers as well, or I would talk to any SI (systems integrator) that we had in-house at the time, to help me understand what's out there or what types of tools could help us solve this, in order to then go scan the market appropriately. Yeah.

CW:

Yeah. That’s, so that’s

Michelle Gouveia:

Actually, that answer is really interesting, Brandi. It touches a little bit on what we do at Sandbox as part of our day-to-day job. There are two ways that we source companies, meaning go out and find companies that we may want to pursue an investment in. One is: what is the technology that you have, and is it adaptable to our area of focus, the insurance industry? And as a result, what can that technology deliver to our partners or our investors, which are incumbents in the insurance industry? Alternatively, we talk to our investors and they say: we have this pain point that we are trying to solve; this is what we've looked at, this is what those tools can do and where they fall short; we are really looking for this capability. And that in turn is how we frame some of our questions and some of the research you just talked about. Where you go out to analyst reports, we go out to conferences, we look at different reports, different lists of, you know, top 100 companies or top AI capabilities, things like that.

And we look for companies that we think fit the criteria, and then really do a deep dive to say: what is the business outcome that you help your customers achieve? And then you keep working through those questions and those different iterations of meetings to establish that this company is responsive to what I'm trying to do. Now how do we go and pilot or implement that, and then measure the success on the back end? So I think it's very similar to the B2B sales side and kind of the "let's go find something to invest in." That makes sense.

BC:

You know, it's interesting. I think a lot of people don't get to see the VC space or the investment side. So when you all think about business outcomes, are there certain thresholds you set, like, we're good with these types of outcomes, but if they don't meet our criteria then we have to walk away? How do you all think about that?

MG:

Well, when you're talking about the business outcome that a company delivers for their customer, really what we care about is that they're successful in delivering what they promise to their customer. That's the metric we care about: is there opportunity for growth? And we could do a whole deep dive on the VC space and all the things we consider when we're looking at an investment. But I think what you're getting at is, and this is a broad generalization people will probably come after me for, it's less about the specific business outcome you drive. We care more about the fact that you drive the business outcome that the customer or our investor is seeking. Mm-hmm. <affirmative> Because we invest in a variety of different companies. Some are in the automation space, some are in the data space, some are in the product distribution or product development space. So it's different things that our investors and the industry are looking for. And we really care about whether you're responsive to what we're seeing the industry needs, regardless of what that outcome is.

BC:

Yeah. Do you all think about stickiness too? Thinking about our customer journey, we talk about land and expand, right? And for our customers it's the same thing: they're able to land the technology and then expand it across the organization and see adoption. So is that something you all think about as well? It's not just about acquiring the customer, but also ensuring that that customer continues to be happy and is sticky with you.

MG:

Absolutely. Great. Absolutely. Yeah. Awesome.

CW:

All right, I think that's a good segue back to the point at which I derailed Brandi's train of thought. You were talking about identifying the business outcomes you're looking for, and it made me wonder: how do you go from a very high-level outcome, like, we want to improve the speed from quoting to binding insurance, to an early indicator of, okay, we've had this thing installed for three months, is it doing what we think it's doing? What are the stages along the way to measuring that high-level outcome?

BC:

Yeah. The advice I always give people is, one, when you're evaluating technology, you have to have a high-level business case. To your point, you're really at that 30,000 feet, and it's, I think this can drive that, right? You're making a hypothesis and going with those assumptions, because you can ground those assumptions in actual data that's flowing through your ecosystem. But what's really important is, now that you've implemented it, how do you think about capturing that value? You made a hypothesis early on, which is what influenced you to make that technology purchasing decision. Now how do you capture the value? One thing you usually see missed is baselining those metrics before you even implement.

So you actually have to baseline: what are we going to measure, and how are we going to measure it? And you have to do that before you start implementing the technology or anything that's net new, even if it's just an org change. Because otherwise, how will you know what you actually changed? So that's always an important part. Then you implement; you now have your baseline metrics and you know how you're going to measure yourself. So you can always go back and ask: did we see the change we thought we would, yes or no? And sometimes your hypothesis could be a little too exciting, and you might say, but we were still able to capture a significant amount of value, we still have a healthy business case, and we'll continue to capture value. Now once that's in place, and you're in production and really running smoothly,

then the question becomes: what are we not getting to that we could be getting to? Because what you do on day one is what you were doing on day zero, you've just made it better. So now it's: are we able to get to more than we could previously? And if the answer is yes, then you start really working those things and thinking about what else you could do from an operational standpoint, but also what else you could do in terms of value to the organization. Could I generate revenue? Could I create products embedded in my services? If so, what does that look like, and how do I unlock it?
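The baseline-then-measure discipline Brandi describes can be sketched as a small before/after comparison. This is an illustrative sketch only, not Indico's methodology; the metric names and numbers below are hypothetical.

```python
# Hypothetical sketch: baseline metrics before implementation, then compare
# post-implementation values against the original business-case hypothesis.
# All metrics here are framed so that a larger value is better.

def evaluate_rollout(baseline, current, hypothesis):
    """For each metric, report the observed change vs. the hypothesized change."""
    report = {}
    for metric, before in baseline.items():
        after = current[metric]
        observed = after - before
        expected = hypothesis.get(metric, 0.0)
        report[metric] = {
            "baseline": before,
            "current": after,
            "observed_change": observed,
            "hypothesized_change": expected,
            # A hypothesis can be "a little too exciting" and still leave a
            # healthy business case if real value was captured.
            "met_hypothesis": observed >= expected,
        }
    return report

# Captured BEFORE go-live (the step most teams skip).
baseline = {"submissions_cleared_per_day": 120.0, "pct_inbox_reviewed": 55.0}
# The same metrics, three months after go-live.
current = {"submissions_cleared_per_day": 200.0, "pct_inbox_reviewed": 98.0}
# Hypothesized improvement from the original business case.
hypothesis = {"submissions_cleared_per_day": 60.0, "pct_inbox_reviewed": 30.0}

report = evaluate_rollout(baseline, current, hypothesis)
```

Without the `baseline` dict captured up front, `observed_change` cannot be computed at all, which is exactly the point about measuring before you implement.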

CW:

Interesting. The physicist in me has to ask <laugh>, since a large enterprise organization is pretty much the opposite of a clean laboratory: you've got your baseline, but now you're running this, so how do you sort out things like, oh, a change in the market happened to coincide with when we turned this thing on, and that's what's changing the metric? How do you have those conversations, especially with the business leaders who are asking the question?

BC:

Well, I think a lot of times people are really focused on a hard metric, like total revenue dollars or total cost dollars, versus really talking about what scale or velocity we've created with this thing. Meaning, even if the market has a downturn, are we seeing higher margins? If that's the case, then you've put something in place that can scale or flex depending on what's going on with the market or any impact to your business.

CW:

Yeah. So I think what you're saying is: start with the high level, but you really need to drill down deep to understand how this new widget connects with the business process and the rest of the environment. You have to think more carefully than just the number.

BC:

Yeah, exactly.

MG:

Brandi, how often has it happened where you go into an engagement and that prospective customer says, this is the output we want, and in your experience, maybe with other customers that have tried to implement something similar, are there points where you go: we can do that, but that's not really the metric you should be using or should care about; really, this is what we can help you drive? How do those conversations go?

BC:

Yeah, I think that probably happens 90% of the time. And I'll use Indico, or AI in general, as the example, because it's a really good one. People will come in and they'll say accuracy, that's the measure or the metric they care about. And it's like, well, hold on a second, let's talk about accuracy. What does accuracy mean to you? Because accuracy means different things. If I ask what rate of error you are comfortable with, Michelle, you might say, I'm okay with 2%; Wells might say, I'm okay with 5%; and I might say, I'm okay with 10%. But we all think those are accurate. So I think that's always the one to talk about, for AI in general, not just at Indico.

Because accuracy is really a definition in the eyes of the beholder. You're only as accurate as you want to be, and you're only as accurate as you're teaching the machine learning models to be. So for us, when we hear the word accuracy, it's fine, because that's what the market always says, but we then start asking: what does that mean to you? When you say accuracy, what does that really mean? And then we ask: what business objectives are you trying to influence with accuracy? Because just being accurate with data, what does that do? Does it drive revenue? Does it drive higher margins? What are you trying to influence with accuracy? Then let's talk about that.
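The point that "accurate" means different things to different people can be made concrete with a tiny sketch. The stakeholder names and tolerance values come from the hypothetical in the conversation; nothing here is an Indico API.

```python
# Hypothetical sketch: the same measured error rate passes or fails
# depending on each stakeholder's tolerated rate of error.

def error_rate(errors: int, total: int) -> float:
    """Fraction of extracted field values that disagreed with human review."""
    return errors / total

def acceptable_to(rate: float, tolerances: dict) -> dict:
    """Each stakeholder judges the same error rate against their own threshold."""
    return {who: rate <= max_rate for who, max_rate in tolerances.items()}

# Say a review of 20 extracted fields found 1 disagreement.
rate = error_rate(errors=1, total=20)  # 0.05

# "I'm okay with 2%" vs. 5% vs. 10%: one number, three different verdicts.
verdicts = acceptable_to(rate, {"michelle": 0.02, "wells": 0.05, "brandi": 0.10})
```

The same 5% error rate is "inaccurate" to one stakeholder and "accurate" to the other two, which is why the success criteria need the definition written down before evaluation starts.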

MG:

And that's really interesting too, because we've talked on previous episodes about how accuracy may have incremental improvements. Depending on how you define it, bringing in new data or automating some level may reduce your error rate: from, say, 50% you might now have 70% accuracy. But then you say, okay, now with the 70%, let's take a look at the data that's coming in. Is there a data point that is skewing this accuracy? And if we were to remove it and retrain the model, do we now have an improvement? And to your point, what does that drive downstream in terms of automating, you know, straight-through processing from an underwriting standpoint, the decision-making, what gets kicked out versus what continues through the back-end systems? You're hitting on something we've chatted about before, but I think it's a really great point.

BC:

Yeah. And I think we all get it, right? We all want our data to be accurate, because it's going to influence decision-making, the business, the direction, the strategy. A lot of leaders are now really leaning into their data to ask: strategically, where should I be going, and what's the data telling me? So of course you want it to be accurate. The biggest thing we focus on with our customers now is that the people who were your preparers are now your reviewers. So you're going to be just as accurate, if not more accurate, than you were before, because you now have double the review and you've been able to capture data more efficiently.

CW:

Can we just talk about how almost no one knows how accurate they were before?

BC:

Yeah, no, it's so true. We're having this conversation with a few people right now. We're asking, but what was your error rate before? Yeah.

MG:

And then crickets, like,

BC:

It's really tough, though, and you can't blame them; they're asking the right questions. But as you and I have talked about, we are harder on AI technologies than we are on people. Absolutely. When I was a customer, my favorite thing, and I will probably be called out for this as well, was when we were getting audited for having ML embedded on top of our documents. The question I got asked by our auditors was, well, how do you know it can read? And my question back was, well, how do you know that people can read? Do you do a literacy test every year to make sure they have adequate reading levels?

You know what I mean? How do you know that people can read appropriately? And this is another thing nobody talks about: if you hire somebody to be on your accounting team, or whatever team, today, you onboard them in your production environment. You're like, hey, here's what you do, now go do it. So there's this stigma against AI and machine learning that I feel we're still working around. It's, well, how do you know it can do that? How do you know it's accurate? Same question back, right? And that's why, typically, when you ask, well, what's your baseline, they say, we don't know; we've only quality-checked 20% of our operations to date. And I'm like, okay, well, in the 20% that you quality-checked, how accurate was that? I think that's the inflection point we're at in the market, and people are starting to understand it more: okay, what you're saying is, I can be just as accurate as I am today, if not better. And I think that's the best way to start describing it.

MG:

Yeah, no, I agree. I had never thought about it from that lens, Brandi: when you hire someone new, they're straight into the production environment, with real repercussions for mistakes, as opposed to a test environment and two or three or four rounds of testing before it gets implemented into production. And I've been in a role like that, where I've had to be trained to do the work and my work got QA'd, and there were always errors <laugh>, not in my work, obviously, but there are always errors.

BC:

There are always errors. And the thing is, people make errors, and technology makes errors. We're all just as good as we have been taught to be, or as good as we're okay with, I guess, is the best way to describe it.

CW:

My cynical thought on this is that people are much less satisfied by yelling at a bot than they are by yelling at a person.

MG:

<Laugh>.

BC:

I do think it's tough. It's definitely a transition in the workplace, and I definitely get it. But to your question, Michelle, these are the conversations we have with our customers: this is what you're trying to go after, but really, what are you trying to drive, and are you thinking about it the right way? And our biggest thing is, how do we help you educate the broader organization or the decision makers in this transformation or change process you're going through today? Because it's not just accuracy.

CW:

Yeah. I've said it altogether too many times on this podcast: accuracy doesn't have any units and therefore is not a quantity. You can't turn it into dollars or time.

BC:

I think we’re almost there. I think we’re getting there.

CW:

Yeah. But to your earlier point, the way that accuracy gets turned into dollars or time for your organization is specific to your organization and what you do. And so you have to be very thoughtful about crafting those success criteria.

BC:

Yeah. And again, what else can you light up that you didn't have access to prior to this?

CW:

So let’s

MG:

And Brandi, when, oh, go ahead. Sorry, Chris. I was just going to say: when you think about that, is the approach usually that you go in trying to prove out one metric, and then based on the results you can expand into others? Or do various use cases bring you different metrics that you care about? Or is the goal to prove as much positive outcome as you can? This is a terribly worded question. How many things are you going in to measure at a time to prove success in a pilot, is I guess what I'm trying to ask. Is it one thing for the customer, but you're trying to simultaneously prove two or three things so that it's a better story?

BC:

Yeah. Typically for our customers, we always recommend a pilot or an MVP to start, which is: what's the bare minimum that gets you comfortable with production? Broker submissions are a really good example here. What is the bare minimum set of fields you need in order to clear that submission, meaning you're ready to quote it? Typically we see 20 to 30; it depends, everybody's a little bit different. So say 20 to 30 fields, that's it, because at a minimum you'll be able to go to production and clear submissions as they come in. Then you say, okay, now that you're in production and you can clear submissions, how many more fields does it take to actually quote those submissions?

What else do you need in order to quote that business? Typically it's another 50 to 70 fields. So you add those fields, and now you can clear and quote anything that comes through from a submission standpoint. Then the next question that comes into play: that, to me, covers all the required fields in the operational process of broker submissions, but there is a ton of rich data sitting in those submissions. The other thing we've seen with our insurance customers is that if they didn't clear a submission, or they rejected it, they kind of throw away that data. But that data is actually really valuable, because the submission you rejected or didn't clear has a ton of market data in it that could tell you what's going on in a certain line of business, or what's going on geographically.

And that could influence different decisions, or you could start thinking about ways to create products embedded in your services. Meaning, hey, I'm seeing this trend in this particular line of business geographically; what can that tell me? Or am I rejecting a lot of business because I don't actually have that type of LOB stood up yet? And how much of it am I seeing? I think cyber is a really good example here. A lot of large insurers or carriers were rejecting cyber, and if they had had the data earlier on, they would have said, wow, there's a lot of cyber coming in, maybe I should think about investing in that line of business, because I'm seeing a trend up. But they were a little bit too late, because they didn't have the data that would suggest there was something going on in the market where you might want to underwrite cyber as a risk.
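The tiered field model described here, a small set of fields to clear a submission, more to quote it, and the rest as market intelligence, can be sketched roughly as below. The field names and structure are hypothetical illustrations, not Indico's schema.

```python
# Hypothetical sketch of tiered field requirements for a broker submission:
# a minimal set clears it, a larger set quotes it, and extra extracted data
# is market signal worth keeping even when the submission is rejected.

# In practice roughly 20-30 fields to clear and another 50-70 to quote;
# a handful stand in for them here.
CLEARANCE_FIELDS = {"insured_name", "address", "line_of_business", "effective_date"}
QUOTE_FIELDS = CLEARANCE_FIELDS | {"tiv", "loss_history", "limits", "deductible"}

def submission_status(extracted: dict) -> str:
    """Classify a submission by which tier of required fields was captured."""
    captured = set(extracted)
    if QUOTE_FIELDS <= captured:
        return "ready_to_quote"
    if CLEARANCE_FIELDS <= captured:
        return "cleared"
    return "needs_more_data"

def market_signal(extracted: dict) -> dict:
    """Keep the fields that describe demand by line of business and geography,
    even for submissions you reject."""
    keep = ("line_of_business", "address")
    return {k: extracted[k] for k in keep if k in extracted}

submission = {
    "insured_name": "ACME Corp",
    "address": "Austin, TX",
    "line_of_business": "cyber",
    "effective_date": "2025-01-01",
}
status = submission_status(submission)  # cleared, but not enough to quote
signal = market_signal(submission)      # demand data survives either way
```

Even if `submission_status` returns `"needs_more_data"` and the submission is never quoted, `market_signal` still captures the line-of-business and geography trend the conversation highlights for the cyber example.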

MG:

This goes back to something we talk about a lot: the insurance industry is sitting on so much data, and it's their inability to access it and then, to your point, Brandi, activate that data to make their business decisions, to go after a new geography, to go after a new line of business, or, to your point, to reduce their exposure in a line of business, depending on how their decision workflow is going and what they want to refine.

BC:

Yeah. Could you imagine if a carrier could tell you in a second, oh, all of our marine risk is sitting in one geo? Then they might say, that's probably not good; we probably need marine a little more spread out. Right now it's really hard for them to do that, because they split out by LOB and geo. So there's a lot they could be doing just from that standpoint, and that's still just capturing the required fields to quote the business. But what about all the other data sitting in these submissions? Think about if you could capture 200 or 300 fields, and all those fields could tell you something, or could create something that would disrupt the market. Those are the things we start encouraging our customers on.

CW:

We have implicitly talked about at least three different types of metrics, and I want to make them explicit. We've talked about cost; we've talked about creating capacity, so upside; and we've also talked implicitly about risk. I'm actually interested in this question for both of you, because I don't know: what are folks in the insurance industry prioritizing right now? And in what order do they start thinking about the other metrics, if they ever get there?

BC:

Yeah, I can start, because Michelle probably has a broader view of the market than I do. I would say revenue, or gross written premiums, is another one that's really important in the insurance space. Loss ratios are really important; I hear about them all the time right now. But what we're talking about here is operational efficiency, and operational efficiency to drive what? To drive scale. Most insurance carriers, if you talk to them, can't even get to every submission hitting their inbox today to clear it, or to know whether they should quote it or not. So what does that mean? It means it's really a first-in, first-out, or rather a "this is the first one I saw, first out" situation.

So really, they're just looking at the things they can get to, versus a world where you have technology enabling those processes, where you could actually look at every submission that hits the inbox and know whether you should respond. So what does that mean? It means you could clear more business, quote more business, and find more business, which means your gross written premiums go up, and your loss ratios should go down, because again, you have your preparers turning into reviewers on that data to ensure that you're writing the right risk. So that's the revenue driver, I would say, for most of the carriers we're working with. To your question of where they're starting and what's going on in the market: the underwriting workbench is top of mind right now for most of the carriers we work with.

So it's not just how do I intake this data and ingest it in an efficient way as quickly as possible so that I can respond. Because usually the carrier that responds first is the one that wins, is what we've been told, meaning they're the ones that end up binding the business. So for them it's: how can I ingest it quickly, how can I respond quickly, and how can I get to all of it? Those are the biggest things, but then there's a decision engine that sits under it. What I'm seeing with most of our customers is that they're investing heavily in underwriting workbenches to help drive those decisions. So you've got a tool that's taking the data in, and you've got a tool that's helping drive the decisions, to really fulfill that side of the house.

The second area I'm seeing most of our insurance customers go into is the claims side, so FNOL (first notice of loss). And what does that do? Well, it makes your customers sticky. Then you're coming back to underwriting, right? That business has to renew, and if you weren't on top of it when it came to processing their claims, do you think they're really going to want to renew with you? The answer is no. So that's the second phase I'm starting to see, really around claims. And then policy servicing is really the third tranche you're seeing them go after. Michelle, tell me what you're seeing, but that's really what I'm seeing right now with our insurance customers.
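The shift described earlier, from "first one I saw, first out" to looking at every submission in the inbox, amounts to scoring and ordering the whole queue instead of working it in arrival order. A rough sketch, with hypothetical fields and scoring:

```python
# Hypothetical sketch: instead of working submissions in arrival order,
# score every submission in the inbox and work it in priority order.

def prioritize(inbox):
    """Appetite fit first, then estimated premium, rather than arrival order."""
    return sorted(inbox, key=lambda s: (s["in_appetite"], s["est_premium"]), reverse=True)

inbox = [
    {"id": 1, "in_appetite": False, "est_premium": 500_000},  # arrived first
    {"id": 2, "in_appetite": True,  "est_premium": 80_000},
    {"id": 3, "in_appetite": True,  "est_premium": 250_000},
]

queue = prioritize(inbox)
order = [s["id"] for s in queue]
```

Under first-in, first-out, the out-of-appetite submission that happened to arrive first would consume the underwriter's attention; here every submission gets looked at, with the best-fit business responded to first.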

MG:

Yeah, Brandi, I think you're spot on. You hit on the claims piece, which is what I wanted to hit on too. That first-in, first-out, or triaging the claim as it comes in, is an area of focus too: doing an analysis to say, what's missing from this? What do we have to go back out for? Is this claim similar to others? Do we already know we're just going to pay it? Can we automate that process because it's something we always pay out? Does it need an adjuster to really take a look at it? Is it something we need to look at sooner because it's more complex and therefore will take more time to adjudicate through the process? So from the claims side, I think you were spot on with the intake side.

I think also, and this is Chris's favorite thing to hear me say, it depends on the product and the line of business. The goal for automating an intake process if you're writing in the middle-market commercial space is, how do I automate what's in the email into my system? Versus something a little more straight-through or streamlined, like a personal lines submission or even a small commercial insurance submission: here's all the data, now how do I get it into the system without someone having to manually enter it? Or how do I supplement it with data that double-checks or validates that all the information on that submission is accurate, to then enable the straight-through processing they're all trying to get to, so they can spend time doing the more complex analysis and assessment on some of those more difficult risks?

And I think, depending on where you sit in either of those workflows or lines of business, that will drive the metric you care about and the business outcome you care about, right? I think it's time: the reduction in time it takes to assess what's coming in, but also how people spend their time. Can we reduce the manual part of this process from 90% manual entry to 40% manual entry, with the rest being analytics or analysis done to determine whether we want to write this risk? So I think we're seeing a lot of the same things; we're just seeing it from a broader variety of carriers and geographies in terms of how they all think about it differently.

Yeah, because they all have different channels too, right? We've talked about the digital channels where that information's coming in: the email, the broker, the agent portal that's connected into the carrier via APIs. There are a lot of different ways that information comes in and then gets used internally. And sorry, now I'm just ranting, but you made a great point about the renewal process. What's really interesting, to your earlier point, is that the data that comes in and gets rejected is just sitting there; it's lost. There's usually no view for an underwriter to say, oh, this risk came in last year and we rejected it. So that's an added cost: because you don't have that view to say we've already looked at this, you're probably doing another data call on the same risk you looked at a year ago, potentially to get to the same outcome. Now you've wasted dollars on data acquisition and data validation, and you've wasted dollars and time in underwriter effort, potentially on something that was never going to get underwritten again. So having that feedback loop would be huge in reducing time and effort.
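The feedback loop Michelle describes, remembering that a risk was already seen and declined, can be sketched in a few lines. This is not anything Indico or a carrier has published; it's a minimal illustration with hypothetical field names, and the naive key normalization stands in for the fuzzy entity matching a real system would need.

```python
# Hypothetical sketch: index past submissions so a resubmitted or renewing
# risk can be matched before triggering a fresh (costly) data call.
from dataclasses import dataclass, field


@dataclass
class SubmissionIndex:
    # Maps a normalized risk key -> prior outcome ("rejected", "bound", ...)
    _seen: dict = field(default_factory=dict)

    @staticmethod
    def risk_key(insured_name: str, address: str) -> str:
        # Naive normalization; a real system would use fuzzy entity matching.
        return f"{insured_name.strip().lower()}|{address.strip().lower()}"

    def record(self, insured_name: str, address: str, outcome: str) -> None:
        self._seen[self.risk_key(insured_name, address)] = outcome

    def prior_outcome(self, insured_name: str, address: str):
        # Returns the earlier decision, or None if the risk is new to us.
        return self._seen.get(self.risk_key(insured_name, address))


index = SubmissionIndex()
index.record("Acme Corp", "1 Main St, Hartford CT", "rejected")

# A year later the "same" risk comes back in with slightly different casing:
print(index.prior_outcome("ACME CORP ", "1 Main St, Hartford CT"))  # → rejected
```

The point of the sketch is only that a cheap lookup up front can save the redundant data acquisition and underwriter time the conversation calls out.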

BC:

Yeah, no, I totally agree. I want to pick on personal lines for a minute with you, Michelle. What I'm hearing is really two themes, so I want to see if you're hearing the same and what your thoughts are. The first theme is all around the contact center, or customer service: what I've heard is that the contact center basically went from phones to chat, but nothing's really changed. The other theme: I've asked many people in the personal lines space, what's your dream? If you could map out the dream scenario for a personal lines carrier, what would it be? And what I hear is, we would like to be proactive versus reactive, meaning we already know our customer has suffered a loss and we're already processing their claim before they tell us. So I wonder, one, if you're hearing those themes, and two, what's in the market that would solve for them, or if you're seeing anything different.

MG:

Definitely hearing a lot of interest in both of those categories; they are two very different challenges to solve for. On the call center side, I've absolutely heard a desire for a lot of capabilities and solutions to come in. I agree with you that calling the call center is less about the interaction on the phone, and more about how much you can triage up front via a chatbot or AI. The challenge is that you're really still just getting level-one answers and responses. It's like someone said the other day: you're calling with a question a person could answer in five seconds, but you have to go through three rounds of a chatbot saying, no, you still haven't answered my question, no, you still haven't answered my question, now I want to talk to somebody.

That's even more frustrating than waiting to talk to somebody, because you've now wasted three or four cycles. So from a call center capability standpoint: how do you quickly identify what the challenge is? How do you get a response that's more informative than "I've googled this answer"? The capabilities we're seeing are around how you take that information in and pull out the right pieces so you can generate the right responses. And then there are the metrics we've talked about: time to response, accuracy of response, and those post-call or post-conversation surveys. How do you collect that data to bring it back in and improve the overall customer experience?

On the be-proactive-not-reactive side, there's a lot of noise around that, and it speaks to all of these solutions you've seen, including, on the personal lines side, the geospatial solutions. How can you be a resource to say, there's a hailstorm coming, these are the things you should do to protect your home. The hailstorm has come; it looks like you may have roof damage right here; here's how we can get someone out there to assess it. Or, we have the data and analytics to say we know your roof was damaged, and we've already estimated the cost because we have drone imagery and the analysis to say this is what it would cost, so here's the payout for you. A lot of those pre-automated capabilities are definitely solutions we've seen entrepreneurs trying to bring to the industry. So you're spot on.

CW:

Awesome. That’s exciting. I love living in the future.

BC:

<Laugh>, same <laugh>.

CW:

I had a question, though, about the submissions process, just because I have some sense of how human brains work. Are people really just taking whatever's at the top of their inbox, or are they looking ahead at emails and saying, you know what, I'm probably going to quote that one; that one I'm not going to quote, I'm skipping it? Is that happening, or are people actually just opening up one email after another?

BC:

I can only assume, because I'm not in it every day, but what I hear most of the time is that it's first in, first out. Sometimes it's last in, first out. So they're really just trying to get to as much as they can. But the thing is, with first in, first out or last in, first out, you don't know if those are the ones you should be looking at, right? Those may all be rejects. That's the biggest problem: as these submissions come in, to Michelle's point earlier around triaging, how do we triage them immediately? Because there are likely submissions you would reject from the get-go; there's no way you would ever write them. There are also submissions that are totally in your wheelhouse: this is an easy risk, we don't really need an underwriter to spend a ton of time on it, kind of rubber-stamp it, send it out, it's ready to go. But then there are others that are a little more complex. So there's definitely triaging that I think the insurance space will start doing, because they'll be able to intake and ingest this data better than they have previously.
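The three buckets Brandi describes (reject from the get-go, rubber-stamp, and complex) map naturally onto a triage rule. The sketch below is purely illustrative: the field names, appetite list, and thresholds are invented for the example, not any carrier's actual rules.

```python
# Hypothetical triage sketch: route each incoming submission into
# auto-decline, fast-track, or full underwriter review, instead of
# working the inbox first-in, first-out.

DECLINED_CLASSES = {"fireworks manufacturing"}  # illustrative: outside appetite
FAST_TRACK_LIMIT = 1_000_000                    # illustrative: small, simple risks


def triage(submission: dict) -> str:
    if submission["class_of_business"] in DECLINED_CLASSES:
        return "auto-decline"        # never going to be written; respond fast
    if submission["tiv"] <= FAST_TRACK_LIMIT and submission["loss_count_5yr"] == 0:
        return "fast-track"          # the "rubber stamp" candidates
    return "underwriter-review"      # complex risks get human attention


inbox = [
    {"class_of_business": "fireworks manufacturing", "tiv": 50_000, "loss_count_5yr": 0},
    {"class_of_business": "retail", "tiv": 400_000, "loss_count_5yr": 0},
    {"class_of_business": "retail", "tiv": 8_000_000, "loss_count_5yr": 3},
]
print([triage(s) for s in inbox])
# → ['auto-decline', 'fast-track', 'underwriter-review']
```

In practice the rules would come from a model or a decision engine in the workbench; the point is only that a cheap first pass can separate the three buckets before an underwriter ever opens the email.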

CW:

Yeah. And like

MG:

This

CW:

Yeah, go ahead.

MG:

No, I was going to say, Chris, this is not from the underwriting space, but when I worked at Travelers, my role in agency operations was to appoint agents on behalf of Travelers to write business. That required checking that they had state licenses, figuring out what line of business they wanted to write, and then appointing them with those things. And there were a number of other work items we handled, like changing their name. To your point, we would get those things assigned to us, and one application might come in and say, I only need to be appointed in the state of Connecticut: an easy five-minute work effort, done. But I don't know if the next item that comes in is someone asking to be appointed in all 50 states.

And that's a lot more work, right? That might have been the first one in, but if it's going to take me two hours versus five minutes, and I don't have two hours, I'm probably going to wait. And so it lingers a little bit. I presume that from an underwriting standpoint you triage in a similar fashion: this is something quick and easy I can get out in the half hour I have left, versus something where I need to do data calls, I need to validate that this information is correct, or, if it's a multiline quote that's come in, I need to go validate it with my colleague underwriter who handles that line of business. So there are challenges there. How can you automate some of that? How can you identify it upfront, and maybe, as a result, route the work based on expertise or time so it can get done in the most efficient way possible?
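Michelle's two-hours-versus-five-minutes example, and Brandi's follow-up that the deferred item may be the most valuable one, suggest ordering the queue by estimated value per unit of effort rather than arrival order. The sketch below is a toy illustration with made-up numbers, not a real prioritization model.

```python
# Hypothetical sketch of value-aware queueing: instead of pure FIFO,
# order work items by estimated value per hour of effort, so a
# two-hour, high-value item isn't perpetually deferred.
import heapq


def prioritize(items):
    # heapq is a min-heap, so negate the score to pop the highest
    # value-per-hour item first; the index breaks ties deterministically.
    heap = [(-item["est_value"] / item["est_hours"], i, item)
            for i, item in enumerate(items)]
    heapq.heapify(heap)
    while heap:
        _, _, item = heapq.heappop(heap)
        yield item["name"]


queue = [
    {"name": "CT-only appointment",  "est_hours": 0.1, "est_value": 5_000},
    {"name": "50-state appointment", "est_hours": 2.0, "est_value": 500_000},
    {"name": "routine name change",  "est_hours": 0.5, "est_value": 1_000},
]
print(list(prioritize(queue)))
# → ['50-state appointment', 'CT-only appointment', 'routine name change']
```

The two-hour, 50-state item jumps to the front because its value density dominates, which is exactly the case FIFO handles worst.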

CW:

Yeah. The decision scientist voice in my head is asking: do you actually have to open up every bit of data, or is there some minimal set of signals that will tell you, you know what, this is a waste of time, don't even bother processing the whole thing? Or at least batch it for later analytics and store it. But that's probably a whole other podcast, or product, or something.

BC:

Well, right, and the example Michelle just gave, the one she put to the side because it would have taken two hours, may have been the most valuable to the enterprise, right? And you just put it to the side because you didn't have time for it.

CW:

Yeah,

MG:

I think there have been a ton of really great nuggets in here, for people wanting to understand how to run a pilot and what the important things are to think about, and also for those trying to sell into an enterprise: what's the message, and how do you promote what your product can do to the right people to get something off the ground? So a lot to chew on here. And as Chris just pointed out, we've probably got a whole list of things that could be their own episodes.

BC:

So many topics coming out of this that we could just dive into.

MG:

So many. We'll have to have you back. But thank you, thanks for joining Chris and me today. It was great to have you. This has been another episode of Unstructured Unlocked. Thanks, everybody.

Check out the full Unstructured Unlocked podcast on your favorite platform.
