
Evolving the Enterprise
Welcome to 'Evolving the Enterprise,' a podcast that brings together thought leaders from the worlds of data, automation, AI, integration, and more. Join SnapLogic’s Chief Marketing Officer, Dayle Hall, as we delve into captivating stories of enterprise technology successes and failures through lively discussions with industry-leading executives and experts. Together, we'll explore the real-world challenges and opportunities that companies face as they reshape the future of work.
Navigating Process Improvement and Automation with Carrie Nedrow, Management Consultant at Enduring Solutions
In this episode, join host Dayle Hall as he engages in an insightful conversation with Carrie Nedrow, an experienced consultant specializing in process improvement and automation. Throughout the discussion, they explore the key factors that contribute to the success of process improvement initiatives and automation projects in organizations of all sizes.
From the importance of stakeholder engagement and transparent communication to the critical role of data dictionaries and the challenges of integrating disparate systems, Carrie provides valuable insights and practical guidance. She emphasizes the significance of understanding the goals and objectives of automation projects, setting realistic expectations, and addressing the people aspect of change management.
Carrie shares her experiences working with both large enterprises and small startups, highlighting the similarities and differences in their approaches to process improvement. She also discusses the exciting potential of generative AI technologies while urging caution in their implementation, stressing the need for thorough evaluation and fact-checking.
Whether you're a business leader, project manager, or someone interested in the world of process improvement and automation, this episode offers valuable perspectives and strategies for successfully navigating the complexities of connecting the dots in an ever-evolving technological landscape.
Sponsor
The Evolving the Enterprise podcast is brought to you by SnapLogic, the world’s first generative integration platform for faster, easier digital transformation. Whether you are automating business processes, democratizing data, or delivering digital products and services, SnapLogic enables you to simplify your technology stack and take your enterprise further. Join the Generative Integration movement at snaplogic.com.
Additional Resources
- Follow Dayle Hall on LinkedIn
- Follow Carrie Nedrow on LinkedIn
- Learn about the Evolving the Enterprise Virtual Summit
- Turn ideas into production-ready workflows and data pipelines with Generative Integration
- Back to basics: What is iPaaS? | What is Data Integration? | What is Application Integration?
Navigating Process Improvement and Automation
This is the Automating the Enterprise podcast. We bring together thought leaders from the worlds of data, automation, AI, integration, and more. Hosted by Dayle Hall, we'll share stories of enterprise technology successes and failures through lively discussions with some of the leading executives and experts in the industry today. This podcast is sponsored by SnapLogic; we bring intelligent automation to your enterprise.
Dayle Hall:
Hi, you're listening to our podcast, Automating the Enterprise. I'm your host, Dayle Hall. This podcast is designed to give organizations out there the insights and best practices on how to integrate, automate, and transform their own enterprise. Joining us today, Carrie is a seasoned business consultant with expertise in digital transformation, process optimization, change management, and all those good things that most enterprises are going through today. With 20 years of experience, she's helped companies of all sizes navigate major transitions, improving their operations along the way. Please welcome to our podcast, Carrie Nedrow.
Carrie Nedrow:
Hey, thank you, Dayle. It's great to be here.
Dayle Hall:
Great. Thanks for joining us. Before we jump in, what we usually try and do is give the listeners a little bit of background, talk us through some of your experience, and then we'll really dig into some of the projects.
Carrie Nedrow:
Hey, I'm really excited about the topics that we're going into today. My passion and my background is all in being a process nerd and efficiency geek. I've spent easily 20, 25 years helping companies find efficiencies in their organizations and working through some really rough transitions in their tooling, and their platforms, and in their organizational structures to support their business strategies. I've had terrific experiences working with some Fortune 50 companies and Fortune 100 companies over the last 30 years. I am currently an independent consultant and enjoying being part of the larger world. Okay.
Dayle Hall:
That's good. Again, this makes it authentic, Carrie, don't worry about it. Okay, so let's look, let's dig in a little bit. I liked what you said, did you say process nerd? Is that what you call yourself?
Carrie Nedrow:
I believe process nerd was what I said.
Dayle Hall:
I've never heard that terminology. I like that. There's nothing wrong with being a process nerd. Let's start with this: to enable processes, to make sure that you can make the right changes, you have to start by looking at what organizations currently have in terms of their processes. But obviously, the major topic right now is the data that they have. When you talk to people for the first time, these organizations, the clients that you may have, where do they usually start? Do they usually come to you and say, look, we know we have to do something, but we don't really know what? Or do they come with very specific processes they want to focus on, where you can help them clean things up and accelerate their business? Where do you usually start with a project?
Carrie Nedrow:
Typically, a C-level person will reach out to me, and they will be expressing a certain level of anxiety about an outcome that isn't happening for them. It could be a team is performing very poorly, or they've overinvested in some platform and it's not delivering, or a vendor is not meeting their expectations. In my process of helping them define the problem statement, the real problem is rarely their initial assessment. It's more often that they are not doing a deep root cause analysis with the people who are actually doing the work.
And it's usually in combination with having some pretty clear strategic objectives but not having succinct, actionable, measurable dashboards and quantitative analytics to help them make decisions. They will be working under the premise that something is failing, or their team is terrible, or that a vendor needs to hit the highway. Sometimes all those things are true, but that's not going to solve the problem. Typically, they're just trapped in not having clear and actionable data that they can point to and say, here's what we're solving for, and here's how we know we're getting close to it or we're too far away from it.
Dayle Hall:
You think that begins with understanding what they were trying to do in the first place? Or do you think a lot of these organizations and clients, do they not really define what success looks like up front?
Carrie Nedrow:
I think companies are getting a little better about what success looks like. If this were 10 years ago, we'd say, oh, this is just a problem. But now folks have really invested in, let's talk about what success looks like, let's try to get a little finer point on that, and let's spread that out to our organizations more broadly in a way that they can understand it. It's about proving results. People understand we need to have mindful spending and budget cuts and whatnot. Let's be very articulate about that. But why are we doing it, and what is the benefit we will get from it other than making our shareholders happy?
I think that people are starting to see better communication from the leadership about what those strategies and goals are, much better than a decade ago. But where things get tricky is bending over backwards to try to produce a number that represents those goals, without that number actually being the fulcrum that tips the truth to we're heading in the right direction or not. This is where I see a ton of companies really struggle when it comes to data management, reporting, analytics, and then using all of those tools to help them make decisions: they think they know where the source of truth is coming from, but what they may not realize is how difficult it is for their teams to pull that data together from as many as 20 or 30 systems just to calculate a single data point that would be their measure of success.
Dayle Hall:
I don't want to minimize the challenges that organizations have, but is it typically that the data is wrong, or that you don't have the right people or tools or processes? Or is it that it's just too hard to pull the right data together in the right way? What is the one thing you'd say it usually is? Is it usually access to the data and knowing what to do with it? Is it usually that the data is just garbage? I know a lot of organizations have that. But is there something across all these organizations that's usually the big challenge?
Carrie Nedrow:
Yes. The answer is yes to all of the above. There are always garbage-in, garbage-out problems. One of the other ones I see is that data in one system has the same label as data in another system, but they don't mean the same thing at all. They're very different definitions. And you get a very clever person in the room, who's really smart, by the way, who says, let's pull this stuff together. And then they go from team to team, or system team to system team, and say, let me pull all this data, I'm going to manage it all, when that's really not what happens. They may not articulate the expectation of how that data will be used, and so it gets misused. That connection point is often not well administered.
A lot of companies have a software addiction; they just keep buying software. They don't mean to, but it happens. Every week, there's a new piece of software in the company. They don't have a standard that says, once we decide we're going to keep this piece of software, we're going to send it over to somebody to build it into the company's data dictionary. A lot of companies don't even invest in a data dictionary anymore. Maybe they'll do things for their compliance and risk management, a few things like that, but they're not going to do it for their daily operations. They'll do it for their customer data. They'll do it for maybe their employee data. But they won't do it for their transactional data that happens inside the enterprise.
Dayle Hall:
Interesting. Talk to me a little bit about that, because I'm familiar with the term, but I haven't really dug into it. So tell me, what is a good methodology for a data dictionary?
Carrie Nedrow:
You've got a great data dictionary when anybody in the company who wants to make a request or ask for a report, because let's face it, getting data is really tough, can just go and look it up on some online tool that tells them, if you see this field and it's from this system, this is what it means. There are a gazillion tools out there, and you can write your own processes to go and hunt and search and combine data sources, but it's back to garbage in, garbage out. If your source system doesn't have a well-defined data set with all of its definitions and parameters organized, then you're just going to send over a bunch of fields and maybe some cryptic definitions or descriptors with them.
Companies are getting a little more savvy now, realizing that these are not standalone tools. Some of our big enterprise tools, we love them, right? They're great. But there are so many niche solutions out there that companies want, and almost every company is tacking something onto that central nervous system. It doesn't matter if it's like, oh, here, here's my finance tool.
I won't name it, but there's a finance tool, and it's amazing. We do AP and receivables, and we do manage our expenses out of this, and so on and so forth. We do our annual plan. But it doesn't do reporting the way we like it. And our shareholders want to see this kind of thing, and so then they buy this thing that they bolt onto it. And everybody's like, oh, that's great. That's designed to consume data.
But imagine doing that in HR, doing that in marketing and sales, and in your software development process. Everybody's got a bolt-on, at least five or six of them. And often, they're not even bolted on. They're like a little island over on the side. And they do one thing, and that team is so happy with that thing until they have to get the reporting to match up to the rest of the company.
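To make Carrie's point concrete, here's a minimal sketch of what a data dictionary entry might look like in practice. Everything in it is hypothetical, invented purely for illustration; it shows the trap she describes, where the same label ("status") means something different in two systems, and why a lookup step before reporting matters.

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """One entry in a company-wide data dictionary (hypothetical schema)."""
    system: str      # source system the field lives in
    field: str       # the field's label in that system
    definition: str  # what the field actually means
    owner: str       # team accountable for the definition
    example: str     # a sample value, to remove ambiguity

# Hypothetical entries: the same label, "status", means something
# different in each system, which is exactly the trap Carrie describes.
ENTRIES = [
    DictionaryEntry(
        system="CRM",
        field="status",
        definition="Sales pipeline stage of the account (lead/active/churned)",
        owner="Sales Ops",
        example="active",
    ),
    DictionaryEntry(
        system="Billing",
        field="status",
        definition="Whether the latest invoice is paid, due, or overdue",
        owner="Finance",
        example="overdue",
    ),
]

def lookup(system: str, field: str) -> list[DictionaryEntry]:
    """What anyone requesting a report should be able to do:
    look a field up and learn what it means before using it."""
    return [e for e in ENTRIES if e.system == system and e.field == field]

if __name__ == "__main__":
    for entry in lookup("Billing", "status"):
        print(f"{entry.system}.{entry.field}: {entry.definition}")
```

In a real organization this would live in a shared catalog tool rather than a script, but the shape is the same: system, field, definition, owner, example.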
Dayle Hall:
Let's talk a little bit about the term automation, right, the focus for this podcast series. There are very different types of automation, and there are different levels. When you talk to your clients, do they come to you wanting to do more automation, already knowing what they want to do? Do you help them understand that sometimes you have to understand what's being automated, you have to understand what your goals are for that? Because what we see a lot is that if you don't go into it with a good outcome, something you want to solve for, we talked about this, then just coming in and looking at places you can automate doesn't necessarily deliver what you expect. This comes back to setting the expectations up front, being communicative on what the outcomes should be.
But then I think about what you said on the data dictionary side. The point of a data dictionary is to explain where things may be misunderstood and where you can get the data. Does automation help? Or does it sometimes make things harder if the systems don't already talk to each other? Does it cause more problems sometimes?
Carrie Nedrow:
I heard a couple of questions in there. So let me jump on one, and then I'll move to the next one. The first one I heard is, how do they come to me, what are they asking for. Usually, what I hear is that something is broken and it costs too much money. But the root cause is rarely just that it's broken or that it costs too much money. Then there will also be the dream of, why are we doing this the way we're doing it? I want to get some maximum efficiencies and repurpose people for other work. That's the classic.
And then the second part of your question was, do they understand that there's value in evaluating the process and looking at how things are working prior to jumping in and just solving? It is my experience that most of them have already jumped in and tried to solve, and it's not worked out well. When it doesn't go well- at one company I used to work for, my nickname was the garbage girl because I would just come in and clean up messes that other people had made, which is great, fun work. And I do that all the time.
When you want to automate something, you first need to say, what is our objective and what is our goal? And a lot of companies have invested in that. They're like, we have this number of efficiencies we're looking for, we have all these manual tasks. Maybe they have their plethora of Design for Delight and Six Sigma people and all that kind of stuff going on. And we're happy for them. Yay. Usually where they get lost in the sauce on this is they will have an enormous idea of what to do.
Dayle Hall:
Is that a good thing or a bad thing?
Carrie Nedrow:
Oh yeah. And then they have a great program manager. They've got good project managers. They've got good technical leads. They've got the disappearing business analyst, a role I'm sorry to see hardly exists anymore in companies. Maybe they're called a product manager or something, but it's not the same. They'll have this whole team there. These are big companies, or medium-sized companies, that have this dream, but they haven't really identified the sequence of events that are going to get them to the outcome. And I've done this with companies that have a quarter million employees. I've done this with companies that have had 20.
Where I usually come in and help them understand how to get back to their dream, and not have it be this giant catastrophe, is with the question, what is the one thing that will make a difference tomorrow? And then you start to work your way backwards through that. I usually spend one week working on understanding the big vision and the one thing that's going to make a difference tomorrow, and then building a critical path that gets them out of their hole, back into the sunshine, and then moving toward their big dream. My goal is to have one win really fast so that people are excited: oh, this is not a colossal failure, we didn't just lose whatever massive investment we put into it.
And again, nobody likes to spend money on internal efficiencies. They'd rather spend it on product. They'd rather spend it on their customers. They'd rather spend it on revenue-generating activities. Let's go, marketing, right? But when you do get those precious funds and the precious resources to work on internal efficiencies and automation, it's so easy for it to go sideways, because these software companies will give you the song and dance. We're going to connect all your data. We're going to do all these things. You can have one system talk to another. This is all going to happen. And it's true, they can. But often you don't resource those things properly. Often, the implementation goes sideways because you've got a third-party vendor who has done it before but doesn't know your organization, and you don't have the right people working with them to set them up for success.
Often I find myself negotiating between the internal resources and external resources, or the internal team connecting the dots together, getting them all realigned on what is possible. And then are we having a technical issue, or are we having a mental issue? That's the end of the story.
Dayle Hall:
It's a good point. Without talking about specific tools, how many of the organizations you talk to have tried to do this, maybe they've done it the right way up front in terms of outlining success criteria, maybe they haven't, and it's just either not delivering what they thought, or they just don't see the value of it? How many people go down this path and are then just left, not disappointed exactly, but they couldn't do what they wanted to, or it's not delivering what they expected?
Carrie Nedrow:
I'm going to make a lot of enemies here.
Dayle Hall:
The truth is never the enemy.
Carrie Nedrow:
I would say 80%, maybe more. I would say, for smaller companies, it's naivety, believing what they were sold by the vendor. It's not that the vendor was trying to mislead them. The vendor simply is giving them their pitch, and they are not used to hearing a pitch and don't understand what that means when they say, here's the miracles that will happen in your company. And they don't know how to ask, what does that take to actually get that up? Who's going to do that? How does that work? What resources do you need from my team? What’s the time commitment? They don't know how to ask those questions in an effective way. They try, but the reality of it doesn't settle in.
Dayle Hall:
Yeah. Just looking from our side, we have a lot of customers that come to us that have tried to put RPA processes or tools in place. And what they've realized is they can't go as far as they need to because, whilst they want to automate certain things, there's a key piece missing around actually integrating the right data first, before they can get to the automation piece. And what happens is some of these companies have grown massively quickly, acquired a bunch of customers. Again, a lot of the people that talk to us realize that they're not getting the full potential. And they're paying a lot of money. I think you probably see some of this directly, which is that those types of clients will come to you and say, what are we doing wrong, or how are we going to fix it?
What's typically the guidance that you give them if some of these things are failing? Do you go back and go, let's start from the beginning on what you were trying to achieve? Do you really look at the process? Do you then point out, look, you're not pulling in the right data? How do you usually manage that? Because I can imagine, there are probably some sensitive people in that room that have made some of those decisions.
Carrie Nedrow:
Yeah. I have this phrase: people talk about change management, and they say, oh, people, process, technology. And I throw on the last P, which is politics. From my point of view, when I approach a situation that needs to turn around, I try to understand the voices in the room and their attachment to the outcome. These are usually high stakes. They've invested not just a bunch of money in it, they've also invested their personal performance and their ego and their reputation. That's why I can never talk about my clients, because, typically, they're calling me because they're up shit's creek without a paddle. And they are trying to solve for the last P, the politics, more than anything else. And that's great.
But where I dive in, I dive in pretty much like an octopus. I just open wide, as wide open as I can, interview the program team, the project team. I interview all the stakeholders. I personally do a review of the process documentation that exists today and try to match that to those people that I'm talking to. I look at the data schema, the architecture. I look at the technology implementation schema. I go after the full circle, as well as what is going to be the adapt-and-adopt criteria for this thing to be successful at the end.
And then I usually come back within a week, two weeks max, with my first impressions. And I try to talk to as many people as I can. It just depends on the size of the company and the scope of the effort, but it's not uncommon for me to talk to 75 to 150 people in a week or two, because you need to know what they're doing. And so I just go in on this giant dive. And I typically do this by myself so that I have as unbiased an opinion as possible when I complete this assessment.
When I am reshaping the here-we-are-and-where-we-can-go-from-here conversation, here's a couple of recommendations, this is what I would do next, I do fill in whoever my sponsor is. And sometimes they're gunning for somebody and they just really want to get rid of somebody; that's often the case. And it's like they're using me as some kind of bomb or laser beam to come in and take them out. And if it's warranted, if that person really has ethically and actually and technically destroyed the value of their investment, of course, that's going to come out because, as you said, the truth is never the enemy. But typically, it takes many people to screw something up. It's not a one-person show.
And so we go into, let's talk about solutions, and let's talk about the weak points of what's happened here and what we need to invest in to get back to your solutions and get this off the skids. On rare occasions, I recommend scrapping the whole thing. But it's very rare. Almost everything can be salvaged. It's just a matter of finding the people who are willing to let their egos go, and finding the right notes for a company to reframe it so that it regains its enthusiasm and people aren't just rolling their eyes one more time.
Dayle Hall:
Yeah. And I think one of the hardest things to do is to look at a process or a tool you've implemented. You can go through the questions: did we set it up the right way, are we pulling the right data? But the change management for the people, I think, is always the hardest. Not just the change in the process that they've tried to make, but particularly, if things are not working, you have to go through a second layer of change management, which is potentially getting people to understand we might need to go back before we can go forward, and there may need to be some change of personnel or attitudes or anything like that.
What have you seen? Because I look at these podcasts as, I hope someone is out there going through a challenge and thinking, I haven't thought of that. That's the whole point of these things, right? If someone's out there, maybe they're in the same position, maybe they're going through the this-isn't-working phase, and maybe it's something within the people in the organization, how do you help organizations address this in the right way to really have better change management with the people, not just the process? What have you seen that works?
Carrie Nedrow:
Wow, that's a great question, because I tailor things to the temperature and the trauma of the organization. Typically, I go at it a couple of ways. One way is to assess how much trust there is that the organization is actually going to do what they say they're going to do. If something has been brought up multiple times and it's failed multiple times, there's usually very little trust that such a thing will occur. That means you need more than talking heads from the senior leadership saying it's going to happen: look, we want to get our folks involved, yes, it's going to happen this time, we're going to do it. If they don't believe you, you need to go at it a different way. So that's one thing: let's get back to trust and belief.
The second thing is back to classic change management. What's in it for me? What do I get out of going at this one more time? And make it personal. And then really talk about what is happening in the industry, what's happening with your competitors, what will happen if you don't do this. Because our world revolves around carrots and sticks, and we've known this for as long as humans have realized they needed to eat: what does it take to get food in your belly? You have to go back to the caveman instincts. And that's okay. I don't mean to say that people are just distant relatives from a million years ago, but we still carry within us a deep need for security and confidence and trust. And if you can address those in a way that is relatable and honest, you will have a much better outcome.
The other thing that I really recommend, and often I won't take a job if they won't sign up for it, is that they must have someone whose neck is in the noose on it, who must be the living example of this thing going forward. Let's say you have a security program, you've invested a bunch of time and money, and then it peters out for some reason or another. You need a face up there that is responsible every single day for the performance of that outcome. Will they get fired if it goes sideways? Maybe. But that needs to be the clear and present danger for that one individual. You don't often see that level of investment. And it doesn't have to be your CEO or anything; that's not what I'm talking about. It has to be someone that people identify with the success of this thing, and they trust that person, and they also want to help that person.
Dayle Hall:
It’s an interesting way of looking at it. Do all leadership teams who are hoping these projects kick off okay, do they all get that there's going to be a change? Do they give it the right time and effort? Do they give their time to make sure that they're communicating on some of these things? Or is that, again, something that you generally have to go in and advise them on, like, these are the things they need to hear from you, you need to communicate why this is happening, you need to make sure you call out success? Do they get it? Or is that something that they also need help with?
Carrie Nedrow:
It really comes down to the maturity of the senior leadership communications person. If you have a really good internal employee communications person who is feeding sound bites to their CEO and their senior leadership team, and this is it, cool beans, probably you've got a good lever there, and pieces of that are probably well in place. Some more mature organizations actually have a change management team, and they're amazing. But they're still too thin, and there's not enough of them, and sometimes things fall through the cracks.
If you're a program manager, let's say, program managers get saddled with change management a lot, and they often aren't fully equipped to do it. They are getting more training. But when you are trying, let's say, to automate some process that goes across a couple of organizations, say you're trying to automate accounts receivable from a direct customer process, you have to have your customer delight, your customer service organization, interact with the customer. Those systems have to interact with a third-party bank. That third-party bank has to interact with your finance systems. Your finance systems have to interact with your balance sheet and how you’re reporting, right? That's one process, as simple as it can be. Let's pretend there's only one system for each one of those organizations.
And I've been down this pathway. Your program manager is saying, I really want to automate this so that when a customer inquires about a bill and then agrees to pay that bill, it will be automated and it'll flow all the way through and hit the balance sheet at the end. They may not realize that there's somebody in finance who is really tied to a very complicated reporting process. The program manager may be thinking it's all on the front end, between customer service and accounts receivable and payment processing into the bank, not realizing that all the work is really on the back end. And this is where the real drama is for that person.
When they're setting the expectation that we're just going to automate the universe and it's going to be great, this one little process, right, and it sounds so fabulous, what that program manager may not fully understand and appreciate until they get deeper into it is that now they're up to their neck in some tricky soup that they don't know how to swim out of, and they've pissed off the whole finance team. And now they have to regain the trust of that finance team. It's a common mistake. Part of that is because our program managers out there in the world, and god bless them, they walk on water, are only human, and often they don't have the time and resources to do that full end-to-end look and do a full stakeholder review prior to launching. That setup is something that often goes missing. But as a program manager, I would tell people, you need to do your stakeholder analysis all the time. All the time. Don't ever stop working with your stakeholders. It is a very time-consuming process. It is the difference between winning and suffering.
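Carrie's accounts-receivable example is really a chain of system handoffs, each owned by a different stakeholder. Here's a rough sketch of that shape; all the step names and the payment record below are hypothetical, chosen just to illustrate why a stakeholder review should cover the whole chain, including the back-end reporting step, before automation starts.

```python
# A hypothetical sketch of the end-to-end chain Carrie describes:
# customer service -> third-party bank -> finance -> reporting.
# Each hop is paired with the stakeholder who must be consulted before
# the process is automated; skipping one (e.g., financial reporting)
# is how a program manager ends up "in some tricky soup."

from typing import Callable

Payment = dict  # simplified stand-in for a payment record

def customer_service(p: Payment) -> Payment:
    p["inquiry_resolved"] = True                 # customer agrees to pay
    return p

def bank(p: Payment) -> Payment:
    p["settled"] = True                          # third-party bank settles funds
    return p

def finance(p: Payment) -> Payment:
    p["ledger_entry"] = f"AR-{p['invoice_id']}"  # posts to the ledger
    return p

def reporting(p: Payment) -> Payment:
    p["on_balance_sheet"] = True                 # the complicated back-end step
    return p

# The stakeholder review should walk this whole list, front end to back.
PIPELINE: list[tuple[str, Callable[[Payment], Payment]]] = [
    ("Customer service team", customer_service),
    ("Bank integration owner", bank),
    ("Finance team", finance),
    ("Financial reporting owner", reporting),
]

if __name__ == "__main__":
    payment: Payment = {"invoice_id": 1042, "amount": 250.00}
    for stakeholder, step in PIPELINE:
        payment = step(payment)
        print(f"{stakeholder}: ok -> {payment}")
```

The point of writing it down this way is that the list of stakeholders falls out of the list of steps; if a step has no named owner, the analysis isn't done.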
Dayle Hall:
Yeah, and that's interesting. We talked about managing the change and communicating, so what are the things that you've seen companies be successful around, let's call it, connecting the dots? That could be putting councils together. It could be just the communication. It could be meetings. It could be other areas. What have you seen that's been successful for these companies around connecting all the critical pieces of a process improvement, whether it's through implementing a new tool or otherwise? Because, generally, these processes affect multiple parts of the organization. So what are a couple of best practices, like, this is what I've seen from a company that's been really successful?
Carrie Nedrow:
I think the biggest successes are the ones that really invest in identifying the levels of their stakeholders in the program. Here's our senior leadership team; how much skin in the game do they have? Not much, but they want it to work. Then there's maybe our director, mid-senior level group. It's their departments that have to do the work, which means they're not doing other work. They're invested. Then you've got the people doing the work, right? So you need to have operating mechanisms that support those three tiers and provide consistent transparency both up and down that organizational structure.
Look, technology's technology. You can fix anything, almost anything, in the land of coding, as long as you haven't picked the wrong tool for the job, right? If you've got a machine learning tool that you've implemented, but what you really need is just a parser, guess what, you've got the wrong tool. So darn it, somebody made a mistake. But that's so rare, because of the ability to do research and find the right vendors. Even if it's being built in-house, usually the technology is not the problem. It's almost always the facilitation of what is going well and what's not going well, and getting the airtime between groups to solve those problems in a way that is actionable.
Sometimes you get companies that never make decisions, and so they just spend. Sometimes you get companies that don't have an appreciation for what those problems mean, and so they just tell you to do whatever. And sometimes a group is gun-shy because they've been hung out to dry at some point in their lives. People carry baggage with them, right? And so they behave in the way that they know will be rewarded or is less risky for them. You've got to appreciate where people are coming from. But the ones that are most successful have addressed the hierarchy of information and created a really good, transparent flow.
Dayle Hall:
You’ve talked earlier about larger companies and smaller companies. Does that vary? Do those principles still apply for smaller companies that maybe don't have the same types of resources, that have a bunch of other priorities, that don't have the massive IT organization, that maybe don't really work with, call it, a business analyst or a process nerd like yourself, that don't usually get that time? Is the principle still the same? Or is it just harder to execute? What are some of the differences between some of the larger enterprises and how smaller, medium-sized organizations have to do this?
Carrie Nedrow:
Yeah, there's beauty in the smaller organizations. I'm working with a small company right now. I mean, they're an early-stage startup, four people. And they're outsourcing everything on the planet, because four people can only do so much. I would say that the same exact principles apply, because they aren't just four people; they have a full-on agency, they have outsourced IT, they have outsourced marketing, there's all this stuff going on, right? And they have a platform that's outsourced. So they are phenomenal at really keeping their operating rhythms in place.
Once a week, they meet with each of their major suppliers. Once a month, they make them all get together. Every other week, the four people prioritize what needs to happen for the next two weeks. And because they're small, they can just IM each other all day long. But it's the same principles. The difference is how much you can do. Every company has limited resources, but it's different. If you have 500 people in your technology, business analysis, and program management offices, chances are you can do a lot more than someone who's outsourcing with a very limited budget, or who only has two engineers, no program management, and one product manager, with everybody wearing all the hats.
Dayle Hall:
Yeah, I think in that scenario, it may be even easier to drive change because there's less people to communicate with.
Carrie Nedrow:
Everybody's invested in that small company. Nobody's not invested. Everyone is invested in every single outcome, because at that stage of a small company, you can't move anything without everyone being impacted. So they're either specifically or innately aware of everything that's going on.
Dayle Hall:
Yeah. So I think that's the positive side. You can act faster. People want to be involved because there are fewer people and they do have more skin in the game. The change can just take a little bit longer with everything they have to get through, because they're wearing three hats, four hats, doing that kind of role.
I know this is more like a personal question. Do you prefer working with larger organizations, or the smaller ones based on some of the work that you have to do? What gives you satisfaction in advising these companies?
Carrie Nedrow:
The truth is I like the meaty problem. I don't really care if it's a humongous company or a smaller company. Actually, when it comes to problem solving, I just love the work. The smaller ones, you're making a long-lasting impact because they're so much smaller and they're so early in their development that you got to get it right. You have to nail it. You can't mess it up because it's going to stick with them for a long time.
The name of my consulting company is Enduring Solutions. And the reason I named it that is I really think it's important not to have quick-hit solutions. If you're going to invest in someone to come in and retool your people, your process, your technology, and your politics, you want it to be sustainable. You don't want it to be seasonal.
For a small company, I am helping them understand, here are the choices you're making, and this is what it will look like as you evolve down the road. For a larger company, I love working with them because they've already invested in so much of their customer experiences, their employee experiences. They've already put a bunch of blood, sweat, and tears into it. And they're invested in getting it right. Sure, usually there's some disaster looming on the side somewhere, but they have all the pieces in place for you to just massage what's there, use what's already good in the company, and bring that goodness up to a higher level. So that's really fun, too.
Dayle Hall:
Yeah, I like that. Look, it's been a great conversation. What I'd like to wrap up with, and what I think is interesting: you've described yourself as a process nerd, and you must have had hundreds of interactions with clients and dealt with a bunch of different tools and processes and so on. Given all this experience, as we see what's ahead of us with some of the generative AI technologies, is there something you're specifically excited about, or maybe even cautious about, as you help fix processes in the future with technologies that are just coming to market now?
Carrie Nedrow:
I can't tell you how excited I am about some of our AI tools that, frankly, have been out there for a while and are now a little more readily consumable. The thing I would encourage companies to do is champion your internal employees to find creative uses for those tools, but have guardrails at every single stop and evaluate the outcomes. Don't accept anything that comes out of them until you have well established, and I'm talking six months to a year, that the outcomes meet or exceed your expectations.
Having worked in medical clinical settings, in manufacturing, and in software development, and having seen some of these tools be experimented with and played with, there are massive mistakes that come out of it. And it's not that the tool is wrong. It just hasn't learned everything yet. What's wrong is that we believe it. You could stick anything in there and anything will come out, and maybe it's relevant, maybe it's not. You should be finding ways to fact-check. You should be finding ways to do QA. You should not assume that it actually works until you're pretty darn sure it works.
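Carrie's "guardrails at every single stop" advice can be made concrete. Below is a minimal, hypothetical sketch of one such guardrail: AI output is never accepted directly; it passes through a fact-check step and a human sign-off, and every decision is logged so outcomes can be evaluated over the months-long window she recommends. The function names and the simple sentence-level check are placeholders, not a real validation pipeline.

```python
# A minimal sketch of a human-in-the-loop acceptance gate for AI output:
# nothing is accepted until it passes automated checks AND a human review,
# and every decision is logged for later evaluation.
# All names here are hypothetical placeholders.

import datetime

AUDIT_LOG = []  # in practice this would be durable storage, not a list

def fact_check(draft: str, approved_facts: set[str]) -> list[str]:
    """Stand-in check: flag claims that aren't in a vetted fact list.
    A real pipeline would use domain-specific validation instead."""
    claims = [s.strip() for s in draft.split(".") if s.strip()]
    return [c for c in claims if c not in approved_facts]

def review_gate(draft: str, approved_facts: set[str],
                human_approves: bool) -> tuple[bool, list[str]]:
    """Accept AI output only if checks pass and a human signs off."""
    problems = fact_check(draft, approved_facts)
    accepted = human_approves and not problems
    AUDIT_LOG.append({
        "when": datetime.datetime.now().isoformat(),
        "accepted": accepted,
        "flagged": problems,
    })
    return accepted, problems

if __name__ == "__main__":
    facts = {"Invoices are due in 30 days"}
    draft = "Invoices are due in 30 days. Refunds are instant"
    ok, flagged = review_gate(draft, facts, human_approves=True)
    print("accepted:", ok, "| flagged claims:", flagged)
```

The audit log is the piece that supports her six-months-to-a-year evaluation: you can only establish that outcomes meet expectations if you've recorded what was accepted, what was flagged, and when.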
Dayle Hall:
Yeah. No, I like that. And we're the same here. It's an exciting time. It always feels like technology is moving forward. But what I really liked about this podcast is how much we delved into the process side, how to make these projects successful regardless of what the technology is, and the people side, managing that change to make sure you're bringing people along for the ride. Because, ultimately, a company is only as successful as the people in there doing the work. I don't think generative AI is going to change that anytime soon. So, Carrie, I really appreciate your time.
Carrie Nedrow:
Dayle, I've enjoyed this so much. I hope you have a great weekend. Here it is on a Friday.
Dayle Hall:
Yes, absolutely. We like Friday podcasts. So Carrie, thank you so much for being part of the Automating the Enterprise podcast. To everyone out there, thanks for sticking with us today, and we'll see you on the next episode.
You've been listening to Automating the Enterprise. Each episode aims to shine a light on the real-life challenges and opportunities that companies face on the road to the workplace of the future.