Watch Christopher M. Wells, Ph.D., Indico VP of Research and Development, and Michelle Gouveia, VP at Sandbox Insurtech Ventures, in episode 29 of Unstructured Unlocked with Thomas Mandel, Transformation Manager at Cushman & Wakefield.
Christopher Wells: Hi, welcome to another episode of Unstructured Unlocked. I'm Chris Wells.
Michelle Gouveia: Hi, I'm co-host Michelle Gouveia.
CW: And today I'm really happy to tell you that our guest is Thomas Mandel, transformation manager at Cushman & Wakefield and an aspiring American Ninja Warrior. Thomas, welcome to the show.
Thomas Mandel: Thanks for having me.
CW: Absolutely. Why don’t you start by telling our audience a little about you, your role, and any fun facts. We’ll take whatever we can get.
TM: Sounds good. I'm currently working as a transformation manager within Cushman & Wakefield's Intelligent Automation CoE, or center of excellence. Practically, that means a lot of my responsibilities within the team are looking at the capabilities that we have, from enterprise automation tools like RPA and automated document abstraction all the way through to emerging technologies coming out in the space, and understanding how those can help our different business lines and business users across the enterprise, from both a professional development standpoint and a citizen development standpoint, take advantage of some of these tools to make the way we've been going about work easier and better for the organization moving forward.
CW: That sounds like a big job.
TM: It certainly has a lot of different pieces to it, but it keeps things interesting on our end, especially in a space that's continuing to change with new platforms and services.
CW: Yeah, I’m sure we’ll circle back at some point along the way to the change and the stuff that’s making it so dynamic these days, but I wonder if you want to spend a few minutes sort of bragging on yourself and your team and talk about some of the things that you’ve accomplished in this space in the last few years.
TM: Of course. Our team has been working with enterprise automation for a little over four years now and has become fully embedded across the different service lines and business units within Cushman & Wakefield. As part of that, we've taken on use cases both large and small on the enterprise side and have continued to expand our platforms and capabilities. A lot of it, on our end, is trying to keep up with what vendors are continuing to promote and invest in for their products. Seeing how document extraction has changed even in the last three years from where it used to be has been really exciting, and so has how you can layer that in with broader automation. I think the overall set of use cases is expanding: it used to be that you could only automate simpler, less complex tasks, and now the complexity that you're able to tackle with these platforms and tools continues to increase over time.
And I think it's an exciting challenge and opportunity: where do you look at a business process and underlying opportunity and say, this process makes sense to put some of these tools on top of, it's feasible for us to do so, let's go after it, versus where do you look and say, this process maybe doesn't make sense in its current form? Maybe we can apply some pretty advanced and robust tools to put automation on top of it as is, but is that really the right solution, or do we still need to do traditional process transformation and process redesign? That's the interesting challenge: as tools have gotten better and more advanced, it's easier to create a technology solution to solve a problem, whether or not that technology solution is the right and best fit overall.
MG: Yeah, go ahead Chris.
CW: I was just saying that makes sense. That’s great.
MG: Yeah, so ditto, really interesting. Thomas, in your role, when you're thinking about the technologies that you're seeing, is it you and your team going out to the different groups and units and saying, we have a solution that we think may apply to this workflow of yours because we're familiar with it? Or are they coming to you and saying, we saw what the team helped do over there and we think we have a similar use case that we'd like to run by you or pilot? How does that process work internally for you?
TM: Yeah, it's a little bit of both, Michelle. There is a good amount of selling that our team will do in terms of educating our different business leaders on the capabilities that we have, which continue to evolve as we add new capabilities to our stack. Or we'll see that our investor business is heavily utilizing a platform or technology that our occupier business, which does a lot of very similar processes, hasn't taken advantage of, even though it's existed for a number of years. So we'll walk them through that journey and make sure they're aware of what we can deliver from a capability standpoint. And then, as we get more ingrained within the business units, we'll also have them proactively raise ideas: this project that we did with you in the past worked really well.
Here's something similar that we think we can go after next, or that's an opportunity. And sometimes they know that there are opportunities out there, but not where those lie within the business. So we may come in and partner with them to do a discovery session and say, what are really the five main pain points in your business that are either error prone or that your teams are spending a lot of time on? Let's dive under the hood together, do a true process analysis, understand what the team's doing in the current state, and bubble up those findings, because at the executive sponsor level, they may have a very inefficient process whose concerns and opportunities just aren't being raised up to them. We've seen that sometimes it takes that partnership to really dive underneath the hood, do traditional process shadowing, and see what's going on and where those opportunities may lie.
CW: Two things I want to say to our audience. One, full disclosure: I've known Thomas for a few years now and actually worked with his team implementing Indico and some other technologies at Cushman. I've worked with a lot of companies trying to do this, and I've never seen a team more disciplined and more process oriented, and not in a way that's bureaucratic. Everything is to purpose, and it makes the team very efficient compared to what I've seen in the rest of the market. So if you're trying to get started on something, trying to automate something, reach out to Thomas on LinkedIn; he's a good person to pick the brain of. The second thing I would say, and Thomas, I'd like to unpack that process a little more: we talk a lot about paper-heavy processes in the insurance industry these days, which is obviously apples and oranges from what you all are doing, but they're both fruits. There are some similarities. For someone out there in the audience who's thinking about automating parts of an underwriting process or a claims submission process, what are the steps that you would recommend to get started as they think about that automation and that process management?
TM: Yeah, when evaluating any opportunity, it's really understanding, again, what makes sense to automate and which steps of the process are ready to be automated. Too often we tend to have the association that automation's an all-or-nothing game, and that if it can't do a hundred percent of claims processing, for example, then it's not valuable. Even though you may dissect that and say, we're calling it not feasible because of this 10 to 20% of the process that's just really complicated, or that we could build into the initial scope but it's going to increase our timelines by several months, and maybe there isn't the initial appetite to fund a project that long to see the return. It's having the mindset that automation, and any type of change, is a journey. It's a means to an end rather than accomplishing everything at once.
So if you're able to come in and say, this process is ripe for opportunity, it's ripe for automation, maybe we only do 50% of that process as a wave one to prove out the approach and the technology we're looking to implement, and then add on incrementally, 10 to 20% more of the process at a time, where it makes sense, as enhancements. And if you only get 70% of the end-to-end process automated but you're saving significant time for the team overall, then realize that that's a win. It's not about getting to full automation. It's about how you create incremental value and how you ensure you're realizing that value as you go through and deliver each segment of that process.
MG: I want to hop in here too. One of the things that I did in a previous role at an insurance company, not quite as robust as your role and your mandate, Thomas, was being the key relationship manager when we would look to partner with insurtech companies: bringing them in and saying, we've identified the business case, there's senior-level sponsorship to move this forward. How do we test the solution? Are we doing a POC or a longer-term pilot? What are the metrics and the KPIs that we're trying to measure against? What level of, I'll use that term, accuracy, Chris, which I know kind of grates every once in a while? How much better does our process become by using the solution? Is it more accurate? Are we more precise? Are we more efficient? And I did a few of those.
I'm sure I stepped on a number of landmines along the way to come up with a lot of the learnings and, ultimately, recommendations to senior management on who to move forward with and what that looks like, and then how you build that workflow so it becomes more of a permanent staple in the process within the organization. I imagine the answer that you just gave Chris was full of insights that must come from many, many iterations of working with different business groups on different projects with different scales and scopes. Can you talk a little bit about how those things differ internally? How do you scope out a project? What are the KPIs and metrics that you care about? Are they customized to each business unit? How long do you run these processes for? Because, to Chris's point, you've got a very robust methodology and a framework that seems to be repeatable, and obviously very successful internally, and that takes a lot of work to get there.
TM: Yeah, that's a great question, Michelle, and I think you hit it exactly on the head: it's a process to get to the point that we've reached. Early on, when we first started, our roadmap and strategic priorities were continuing to change. We knew opportunities were out there, but we were building our roadmap several months at a time, whereas now we've matured to building our roadmaps a year out in advance; right now, we're starting to plan the strategic initiatives and projects that we want to go after in 2024. So it is a learning journey: business units aren't going to commit to doing a lot upfront, because they understandably want to see the value that you're delivering. So how can you identify some quick wins, especially if you're a newer team or organization looking to spin up an automation practice, and then how do you quickly turn those quick wins into additional use cases?
And that partnership element throughout the organization is critical to that success. So come in and understand right away, if you're going to run a pilot, what are the key things that you're measuring? What is the value it's expected to deliver? How are you going to track and define that value? And then how are you categorizing it? Is it an internal cost efficiency play? Is it hard cost savings? Is it actually helping you gain and attract new customers because you're able to be more competitive in the marketplace? And if so, how are you going to attribute which percentage of customer gain is due to normal business growth versus due to this one specific initiative that you're running? Those are really difficult questions for many teams to answer on the tracking piece, but you need consensus with the key stakeholders on how you're going to measure and report out on that.
Then there's alignment on whether the effort was successful or not, and you don't run into a situation where the project team delivers what they thought the expected scope was, they're seeing transactions run through, everything's great, and then on the business side they end up not using the output, or maybe it's not usable for them because the scope they thought they wanted doesn't actually meet what they really need to deliver. So that ongoing relationship is critical. It may take us anywhere from one to three months to fully implement a project from when we start developing it. And once something goes live, I think there's a sense that, once it's stable, we can be hands off, but really what we're seeing is that you need continual follow-up with the business to make sure it's still going to be usable, because their needs are going to change over time. And that doesn't mean that what you delivered originally isn't still valuable. It may just need to be tweaked or enhanced slightly to meet their changing needs over time.
CW: Yeah, I have an anecdote that ties together Michelle's accuracy point and your point about the people management side of this, and I'll follow it with a question on that topic. I was working with a client; they had been running in production for a while. They had lots of human review data and they wanted to inject it back into the model, and the model got more accurate in terms of the raw numerical output, much more accurate. But the people doing the data review got slower, because that more accurate model changed the error modes. The patterns they had gotten used to as they reviewed model outputs changed, and the overall throughput went down even though the model was more accurate, which was super counterintuitive to everyone but the people who have worked with artificial intelligence models for many years. So Thomas, talk to me about your approach to change management. If you add a new automation, what are the steps to make sure that the people downstream can still do their jobs and, even more importantly, be happy about doing their jobs?
TM: On the change management piece, I like that anecdote you gave, Chris: you can deliver the best solution from a technology standpoint, but if the end users don't use it or don't want to use it, then it's not valuable at the end of the day. I think that's the hard reality for development teams: it's not only about delivering a product that's great and is going to be valuable, but a product that users want to use and interact with. From a change management standpoint, for us it starts with making sure that you have that sponsorship within the business, at the manager level and above, so that the folks directly overseeing the day-to-day users of the platform understand and are aligned that what is being delivered is valuable and is going to make people's lives easier.
Because the fact of the matter is, if somebody pushes a change on you, you're not finding it valuable, and maybe it's initially harder than the process you were following before, you're likely going to bring that up to your manager. If their response is, I don't know why we're doing this either, it doesn't really make sense to me, then all of a sudden that change is going to fail, because they're not getting that positive reinforcement of: let's take a look at what you're doing, let me understand what your concerns are, and see whether it's just something I need to coach you through, how to do things the new way and what the new approach is, or whether this is actually good feedback that we maybe didn't find during initial requirements and testing. Let's see if we can do something about it, because it makes sense and is valuable.
And a lot of that is making sure we're enabling the business as well: helping them provide trainings and understand what that future state looks like, so they can roll that out to their teams and make sure they're enabled. That piece is really critical, helping the business craft and create resources that they can give out to the teams and use as a guide: here's how we go about the new way of working with this new process, and we have some materials we can reference. So every time someone has a question, they don't have to email someone within their leadership org or someone within the project team; they can self-service some of those requests and then bring up additional questions that fall outside of that, which may be added back to those materials over time.
CW: And as you're assessing projects, is that sort of alignment at the management level, and the flexibility to change on the team, part of the assessment? Or is that so far out of your control that you can't really put a number to it?
TM: It is part of our assessment. That's why, for all of our projects, before they get approved, we're reviewing and putting together a business case with the sponsors within the business to make sure they understand what moving forward with this entails. It's not just that they're signing off and saying, it's a good idea, go build it and tell me how successful it was. We're signing up as a partnership together: they're committing to this project and moving forward with it, and here are the expectations we have and need from them and their team in order to make it successful.
MG: I think that's a really key point, Thomas, and Chris is probably much more familiar with this than I am, but it's key to the success that you all have had in running a center of excellence this way. A lot of what I've seen is that everyone thinks innovation happens quickly, not that there's a year of planning that goes into getting something through the door that would ultimately lead to that innovative change in process and product. And I feel like where a lot of this falls apart is: we've figured out what you needed, we've delivered it to you, now you go figure out how to make sure that it continues to work.
To your point, people have to be aligned on wanting to make it work. It needs to make sense. It can't just be a technology solution looking for a problem; it needs to actually solve a problem. So this is all leading to a question. Going back to something you and Chris were talking about at the beginning, about going around the organization and promoting some of these pilots or things that you've run: do you find that that is key for new groups to want to partner and try these things? To showcase that these things have been implemented, and that there is someone as a backstop to continue to support it even after that pilot phase, so that as you all move on to support the next group, there is still something there to help them?
TM: Yeah. On the first part of that, taking things that we delivered in the past and presenting those out to the business, I think that's critical to the success of automation programs in terms of scaling and continuing to grow. Because, as we know from the sales space, anybody can spin anything they want and sell something that's going to appear really nice. And because of that, we've tended to become more skeptical over time: the automation team coming to a new business line and saying we can do X, Y, and Z isn't going to get buy-in anymore, unless the team is truly just trying to get off the ground and doesn't have any prior experience to show. Once you've been embedded in an organization, what those business leaders want to see is: what have you delivered already in the organization?
How did that go in terms of meeting, not meeting, or exceeding expectations? And if it didn't meet them, what did you learn from that, and how do we know that's not going to happen again with this use case? And then the second piece, again, is wanting to know it's going to be a long-term partnership and support. It's not just that your team's going to come in, develop something, say that it's stable, and then, if a system that it integrates with goes through an upgrade, which is inevitably going to happen, the business has to come back and request funding for a new project to keep it running. It's understanding how that system or automation is going to continue to persist as business changes occur, as system changes occur, or as the needs of the business users change.
And I think the key piece of that is really understanding what that means from a relationship standpoint. Traditional automation support, I think, has been a little too focused on metrics like runs: are we seeing errors in the system? Are we seeing jobs fail? If not, then everything's running great. But you may not be seeing errors in the system while the output that users are getting, which says it was successful, isn't actually including as much detail or as many data points as it was previously. So things are falling off, making it less impactful and less usable for the business, but the overarching job schedules and metrics aren't telling that piece of the story. A lot of times these projects start with getting an understanding from the end users and the business of what they need, and you're delivering against that.
And production support really needs to have that same mindset in place. It's not just a numbers game of monitoring logs and triggers in the system; it's continuing to have those conversations with the end users and getting their affirmation that, yes or no, it's still meeting their needs. And if no, when did that change, and why did it change, so that we can not only correct for it but try to anticipate similar challenges in the future and get ahead of them, so we don't have periods where the automation was effectively down for a month without us knowing about it.
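[Editor's note: the failure mode Thomas describes, jobs that report success while the output quietly loses data points, can be sketched as a simple output-completeness audit that runs alongside job-status monitoring. This is a hypothetical illustration only; the field names, records, and threshold are assumptions, not Cushman & Wakefield's actual tooling.]

```python
# Hypothetical health check: don't trust job status alone; verify that the
# output still contains the data points downstream users depend on.
REQUIRED_FIELDS = ["tenant_name", "lease_start", "lease_end", "base_rent"]  # assumed fields

def completeness(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def audit_run(records: list, threshold: float = 0.9) -> dict:
    """Flag a 'successful' run whose outputs have silently degraded."""
    scores = [completeness(r) for r in records]
    avg = sum(scores) / len(scores) if scores else 0.0
    return {
        "avg_completeness": avg,
        "degraded": avg < threshold,  # alert even though every job 'succeeded'
    }

# Sample run: the first record has two empty fields, so the audit flags it
# even though no job in the run raised an error.
run = [
    {"tenant_name": "Acme", "lease_start": "2023-01-01", "lease_end": "", "base_rent": None},
    {"tenant_name": "Globex", "lease_start": "2023-02-01", "lease_end": "2028-01-31", "base_rent": 12000},
]
report = audit_run(run)
```

A check like this would catch the "down for a month without us knowing" scenario, because completeness trends are tracked per run rather than inferred from error logs.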
MG: And Thomas, one other question from me. I'm trying to equate everything that you're saying back into insurance land and what I'm familiar with. A lot of times, when there are these types of pilots or exercises to get senior leadership to buy in, quite often there are conversations of, well, who else does this benefit? And Chris and I have talked about this a lot: often, if you're automating something in the claims workflow, data that was locked into that workflow before can now get extracted, the underwriting team can access it, and it's this circle where everyone's benefiting. That's all great. So the first question is: are there places in your world where that happens, where a process automation initiative can support multiple groups at once? And the second part of the question is: how challenging does that get, with scope creep making a project too large to actually be digestible and therefore to result in something that feels feasible to implement?
TM: Yeah, we certainly run into those use cases. We actually had an automation and process transformation initiative that was looked at initially from a data entry time-saving standpoint: how do we take data from lease agreements and enter it into our corporate application, saving time at the data entry layer? Then we realized later on that there are internal teams throughout the company that are heavily reliant on that data in the system and are separate from the teams entering it. So while we were just focused on it as a cost efficiency and time savings play, we realized there was also an opportunity if we looked at it from a data quality layer, in terms of the consistency and completeness of the data we're providing in that system. And when you run into those use cases, it does create a challenge of trying to manage two groups of stakeholders: the data entry team thinks they only have 20 fields they need to enter, and you're getting feedback from the downstream stakeholders that they're not able to complete their analysis because they don't need just those 20 fields; they need a total of 30.
So there's consistently a gap of 10 fields that they're going back through and populating manually. And it's really trying to understand who, overarching in the organization, can help address that challenge. Because if one team is asking another team to fund work that's not directly benefiting them, that's where a lot of the challenge with scope and complexity comes in, to where maybe the easier path is just to say, this is what we're focused on, we delivered it for this team, and move on. But the better solution is to recognize that you have two teams within the same organization trying to drive towards the same goal, and it shouldn't matter which team is being asked to perform the work. If it's work we're doing today, where does it make sense for that to live? How do you break down that silo so it's not two separate teams but one organization trying to perform this process, and how do we bring the two of them together and have that conversation about what it's going to add if we want to request these 10 fields be added to the standard?
That team also provides the justification for why they need those fields, and then we have the conversation about where we end up landing. Maybe it's somewhere in the middle: it makes sense for us to add five, it's going to benefit both teams, and that really gets us to where we need to be. But thinking through how the data is going to be used is becoming just as important, or even more so, than just making sure it's there, since that's where a lot of the intangible benefits of automation come into play: how do you get value out of your data, and how do you get better reporting and insights to make better decisions off of it?
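[Editor's note: the 20-versus-30-field negotiation Thomas walks through is essentially a set difference between what the data-entry team captures and what downstream analysts say they need. A minimal sketch, with made-up field names purely for illustration:]

```python
# Hypothetical field inventories; real names would come from each team's spec.
entered = {"tenant", "address", "lease_start", "lease_end", "base_rent"}
needed = {"tenant", "address", "lease_start", "lease_end", "base_rent",
          "escalation_rate", "renewal_option", "security_deposit"}

gap = needed - entered              # fields populated manually downstream today
unused = entered - needed           # candidates to stop capturing entirely
coverage = len(needed & entered) / len(needed)  # share of downstream needs met
```

Surfacing `gap`, `unused`, and `coverage` in one place gives both stakeholder groups a concrete list to negotiate over, for example landing on adding five of the missing fields to the standard, as in the compromise described above.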
MG: It sounds like that crosses over a little bit into being a process engineering initiative, right? Where you need to look at things end to end, identify where some of these automation projects can start to add incremental value, until ultimately the whole workflow benefits from those small changes over time.
TM: Exactly, and I think that's where the challenge in the process engineering lies: it's getting harder and harder to find where a process truly starts and ends with those downstream cases. If you ask one business line, they might say, as soon as the data's in the system, our process is over, and they may not even be aware of who outside their team is using and relying on that data. So a lot of times the challenge is uncovering and unpacking what the broader, true end-to-end process looks like: who first touches this process, data, and information, and who ultimately uses it, which could be a separate team, or potentially multiple teams. And how are you able to identify those downstream users if it's not clear from the point of view of the business unit you're engaging with? Are there system access components that you can work with the application owners to check? Or, if that information gets driven through an integration to a centralized reporting hub, how can you identify who those downstream stakeholders are and make sure you're at least aware of, and incorporating, their needs in the initial design?
CW: You don't have to answer this, but how often is the answer to the question of why do you need these 10 fields, "We've always had those 10 fields"?
TM: I think that does happen often enough. It's always a challenge for people to think about work in a dynamic way, not just "we need these because we've always had them." Another common use case we get along that line, Chris, is: we need this because it's a client-specific requirement. A client asked for this field, they've asked for these data points, so we're providing them. And I think we don't always push back on that to understand how they're using the data, whether they really need it, and whether it's providing value for them. It could be something they've just always had, so they include it in the list of things they ask for and may or may not be using. So it's always trying to think through those elements critically and understand that, while getting as much data as you can out of a document is great, if you're not going to use it downstream, it just leaves you more prone to errors, or spending more manual time in the process if it does require some level of human review. Think critically about what's actually adding value from a data standpoint versus what are additional fields that we think may have value but in reality don't.
CW: I absolutely adore how friendly but adversarial you are with the business owners, because so often business process owners, when they're measuring things, are like a drunk looking for his keys: of course you look under the street lamp, because that's where the light is, but who knows if that's where the keys really are. Forcing business process owners to be very thoughtful about what number directly ties to value, and is measurable and reliable, is super important. So I love it.
TM: And it's one of those continual challenges: just like anything, when you get super close and in the weeds with something, it's often harder for you to see the bigger picture, versus someone coming in externally, looking at it under a microscope, and saying, I don't understand this because I have no prior knowledge of why you've been doing it this way, so let's have that conversation. A lot of it is that willingness to take in an outside perspective, walk them through the process, and understand that there are going to be times when it may not make sense at first, but when you dive deeper, it's completely logical why we've been doing things the way we have. Other times there may be opportunities to change things that were just hard to uncover while being too close to the day-to-day of how it's been run for a number of years.
CW: One of my favorite things to do with new hires is to talk to them after a month of working at wherever I am and ask, what have we been doing that's stupid? I love that outsider point of view; it goes away over time, so I try to take advantage of it as often as I can. With the last 10 minutes or so: we talked early on about the dynamic landscape of technologies in this space. First a general question, then we can dig in on some specific stuff. What are you seeing out there? What's the ground moving under your feet like nowadays?
TM: Yeah, some of the biggest trends I've been noticing that I think are really interesting come down to the democratization of technology and technical skill sets. Platforms have been reducing the barrier to entry over time: it's easier and easier for a person who's more functional within the business to take an automation platform and build something on their own, because it doesn't require traditional coding languages or knowledge of Python or Java, things that may take you years to really get under your belt and master. The ability of platforms to make things more user-friendly for non-technical folks is really interesting and exciting, in that it's easier for people to upskill themselves and take advantage of the platforms. And I think that's only accelerating with large language models, OpenAI and ChatGPT, and tools like Copilot or embedded coding helpers, where you can say, here's functionally what I want to be able to do.
And some of these platforms are able to take that, create an outline of a workflow for you, and then you're really just stepping through it and modifying as needed. So both in making it more accessible to end users and for someone who is more advanced and skilled in development, it's accelerating the amount of work they're able to generate, because they don't have to go through all the nuance of building the framework of a workflow anymore. They can say, here's overall what I want to do, create the first 25% of the base of the code for me, and then I'll go through and edit and refine and test it on my own. I'm not spending time on those rote activities where I just instinctively know, based on what we need to do, that I need these 20 activities in this order in a workflow; I can now have that generated for me.
CW: Yeah, it really puts the onus less on answering questions and more on asking good questions. That's the skill set that's emerging. Since you invoked the name of our favorite topic, ChatGPT, and maybe you can't answer this if it's too sensitive, but how is Cushman wrestling with teams like yours using this technology to solve problems?
TM: Yeah, for us it's really understanding how we can take advantage of a capability like this in a way that's effective and also secure; that's the biggest thing. The downside of the opportunity is the perception that the technology can do everything for you: putting a prompt into a chatbot, having it generate an email template or a newsletter or a snippet of code, and saying, I'm going to take this output directly and use it as a deliverable. That's the hard balancing act with this technology: it's a great accelerator of work, but it's not a replacement for work. And that distinction is difficult for a lot of people to grasp. These tools have a lot of exciting capabilities that are very user-friendly and easy for people to quickly pick up, and the question is how you use them to make your work better and more efficient, in the right way, so that you're not relying on them to do everything for you, but you realize where they can add value and save time.
But you also have to know the level of due diligence, spot-checking, review, and editing you need to do to make sure it's actually returning what you're looking for.
CW: I've been working on tools that let you write code with these large language models against a particular code base that isn't part of their training data. What I've found is that the skill that's becoming critical is finding the code base you think can solve the problem. There are a lot of code bases out there. Once you've found one, prompting to get code out of it is not easy, but it's a pretty well-understood problem. Finding the one that will actually solve your problem from scratch, though, is becoming a really useful skill.
MG: That's way too technical for me, so I'm going to bring it up to a non-technical level. So Thomas, I'm curious whether what I'm seeing in the insurance industry is similar to what your group may be hearing from groups within Cushman and Wakefield. Are they coming to you and saying, there's so much stuff you can do with ChatGPT, what have you found? Is part of your role now to vet some of the solutions out there, maybe brand-new startups that say they have a use case for you? Or is it more of what we're seeing in insurance, where, one, we're seeing that, but two, there's a real focus on what we're doing in our day-to-day that we could continue doing, but now with a real business use case behind it, so it's not just that it makes my job easier, but there's a real case to apply AI to a workflow, or maybe one that didn't exist before? I'm just curious: are the groups you're working with coming to you with those specific use cases in mind, and you're now seeking a solution? Or is it still a little bit of you putting a thesis together and then trying to identify where within Cushman and Wakefield an initial test group may emerge as the first candidate for an AI workflow?
TM: It's definitely been a little bit of both. Because the capabilities have been so well marketed and so visible out there, a lot of people within the business have been able to come up with and think through how AI can change and improve the things they've been doing, and they may have ideas for how to solve it. And then we're also looking centrally, with our TDS organization, at how we identify the tools coming out and assess whether there's a fit for them within our business, and what the size of that looks like. Part of that is just understanding what all of our existing vendors are doing to incorporate AI. That's the second piece of this that's really interesting: almost every technology platform I've talked to recently seems to be at least thinking about embedding AI solutions into their application.
And it's really become a critical differentiator and selling point for any technology platform at this point to be able to say they're doing something with AI, whether or not it's the best fit. That's where the centralized view is really trying to help us: if one of our applications is going to embed a capability, does it make sense to use that? Does it make sense to build a similar solution in-house that we can tailor specifically to our needs? At what point do we say this is an initiative where we let an end user in the organization become a user of one of these AI-powered tools and build something on their own? Those are all things we're actively exploring, to find the right balance and the right approach: how do we take this exciting set of capabilities and make sure we're applying it in the right way to the right problems, so that it's valuable for us and for the enterprise, rather than just taking a new and exciting technology and letting someone use it for a use case it's not a great fit for, or one that simpler, traditional automation tools could solve better and more reliably?
CW: That's great. In six months, once the Gen AI hype wave has washed ashore on the enterprise landscape, we'll have you back to talk about how you sorted through the noise along the way. This has been a fascinating episode of Unstructured Unlocked. I have been your co-host, Chris Wells.
MG: And I'm co-host Michelle Gouveia, and we've had a great conversation today with Thomas Mandel, transformation manager at Cushman and Wakefield. Thomas, thank you so much for your time and your insights. I learned a lot and I think the audience did too.
TM: Thank you. Thanks for having me on.
CW: Absolutely. Take care.
Check out the full Unstructured Unlocked podcast on your favorite platform, including: