Data Driven Leadership

De-Risking Data Transformation

Guest: Susan Schramm, Founder, Go To Market Impact

You can’t avoid risks. You can only plan for them. This is one of Susan Schramm’s cornerstone truths when it comes to data transformation. In this episode, you’ll hear an excerpt from a webinar where Susan shares why data modernization projects fail and how to de-risk them. She also warns against common mistakes she’s seen clients make when leading a data transformation.

You can’t avoid risks. You can only plan for them.

This is one of Susan Schramm’s cornerstone truths when it comes to data transformation. As Founder of Go To Market Impact, a firm specializing in de-risking data strategy, her advice is well worth taking.

In this episode, you’ll hear an excerpt from a webinar where Susan shares why data modernization projects fail and how to de-risk them. She also warns against common mistakes she’s seen clients make when leading a data transformation.

Kyle Roberts, Client Success Leader at Resultant, also joins us to help with our Solution on the Spot. He emphasizes the importance of identifying a clear goal, generating meaningful data, and translating technical work into actionable insights.

In this episode, you will learn:

  • The importance of starting with a business outcome
  • How to de-risk your data transformation
  • Project-dooming mistakes to avoid

In this podcast:

  • [01:00-03:30] Starting with the end in mind
  • [03:30-08:00] Generating meaningful data
  • [08:00-10:00] Uncovering opportunities to change behavior
  • [10:00-17:00] Considering practical significance, not just statistical significance
  • [17:00-20:00] Meeting your users where they’re at
  • [20:00-23:45] Managing the two types of risk
  • [23:45-28:30] Addressing the technology adoption curve
  • [28:30-31:00] How to think about risks
  • [31:00-34:00] Common mistakes to avoid
  • [34:00-38:00] Planning to de-risk
  • [38:00-40:45] Identifying and aligning the right people
  • [40:45-41:05] Measuring the success of a de-risking program

Our Guest

Susan Schramm



Susan Schramm is the founder of Go to Market Impact LLC, a business consultancy that helps CEOs, boards, and leadership teams driving high-stakes strategies get faster results. Under her guidance, hundreds of leaders in business, nonprofit, and faith-based organizations have introduced new offerings and accelerated impact for customers, employees, and communities.

Recent Go to Market Impact engagements include optimizing diverse supply chains, accelerating public safety innovation, implementing secure smart cities, readying the workforce of the future, and equipping purpose-led entrepreneurs around the globe to build sustainable businesses and create jobs.

Susan has held executive roles with IBM, Siemens, Nokia, and Viavi Solutions. She currently serves as an Executive Principal with on-demand solutions group Cygnus Sprints, on advisory boards for 5G solutions provider COMSovereign and analytics firm Cognitient, and as vice-chair of the board of Nehemiah Project International Ministries.

Susan frequently serves as a keynote speaker and guest on podcasts for industry and educational programs.


Jess Carter: The power of data is undeniable. And unharnessed, it's nothing but chaos.

Speaker 2: The amount of data was crazy.

Speaker 3: Can I trust it?

Susan Schramm: You will waste money.

Speaker 5: Held together with duct tape.

Speaker 6: Doomed to failure.

Jess Carter: This season, we're solving problems in real time to reveal the art of the possible. Making data your ally. Using it to lead with confidence and clarity. Helping communities and people thrive. This is Data Driven Leadership, a show by Resultant.

I am your host, Jess Carter. And on this episode of Data Driven Leadership, we're going to dive into data transformation projects. Specifically, we're going to explore why someone might take on a data transformation project and what their outcomes might be. Ultimately, we want to explore what the value of those projects is.

We're going to get to some of this information in the first few minutes of the show. And then afterwards, we've got two experts who will take a real live example and break it down for us. To help me with Solution on the Spot is Kyle Roberts, Client Success Leader at Resultant. Hey, Kyle. How are you?

Kyle Roberts: Hey, doing very well. Thanks, Jess.

Jess Carter: Good. Well, hey, thanks for joining me for Solution on the Spot. Really quickly, before we get into our scenario, can you just share a little bit about why Kyle and why data transformation?

Kyle Roberts: It's funny because I actually come to data transformation from an academic background. I was a professor of statistics for a while and then did research at another institution. But now that I'm in consulting, it's less theoretical, more boots on the ground, which just fits me a lot better.

And so, in helping clients do data transformation projects, this is something where I'm like, "Okay, we're actually making a difference," instead of theorizing about something that only professors care about, which is just the worst ever. And so, it's funny, I was a professor and I didn't even like who I was at that time. Now it's, "I want to actually make a difference." So that's what we're moving into.

Jess Carter: That's awesome. I'm pretty interested in both the academia and the pragmatic realities of data. And both have their advantages, but somewhere in the middle seems to be a lot of fun.

Kyle Roberts: 100%. Well, and it's funny too because I would use all these data sets with my students. And I would always tell them, "Hey, here's how it is." And then you get into the real world and you go, "No data is this clean." I mean, it's totally, completely impractical. But now, turning to the topic today of data transformation: everybody wants datasets as clean as the ones you see in an academic setting, but nobody has them.

And so, when we start thinking about these big data transformation projects, that's usually where we start: okay, A, how bad is your data? And B, what do you want to do with it? What is the end goal in mind? And so, think about this: we get contacted by a company and they say, "All right. We know we need to do data transformation." And then generally the response is, "Well, why?" "Because everybody else is doing it." "Okay. That's a great reason. We should jump off that cliff too."

Just the idea they're not coming through with, "All right. Here's why we know we need to do that." I think that's why most people when they think about data transformation, they just go, "Okay, I just know I need to do something with my data." Not, "I need to start with the end in mind of I need to get to a point where it's usable for this thing."

And generally, they think of it as, "Let's boil the ocean and see what comes out." What comes out is dead fish. We don't need to boil the ocean. What we need to do is actually start with a plan of, "Okay, what is it we want to do with our data?" And then we start transforming it to answer that question. And so many companies just don't start with that end in mind. It's, "All right. We're going to start a new initiative because we have some budget and we feel like we need to start doing stuff with our data." Oh, that's a terrible idea. Terrible idea.

Jess Carter: I couldn't agree more. And well, our solution on the spot's going to be just perfectly aligned to that. We'll just get into it. Okay. Does that sound good?

Kyle Roberts: Okay.

Jess Carter: In this scenario, let's say that you and I are partners. And we're walking into a financial institution that wants to take on a data transformation. And the reality too is one of the first things I'm going to ask, and I'd love for you to expound on this too, is what does a data transformation mean to them? When we say data transformation, what do we even mean?

Kyle Roberts: The data that's coming in from your source systems, or wherever data is generated, may not necessarily be useful. Okay. Let's think about that in terms of a cash register receipt. That's a bill of sale that's actually happening at the point where data's being generated. But all you're seeing on a cash register receipt is what was sold for how much; we don't know the profit on that. We may not be able to pull out of that how many we've sold that month or things like that.

And so really, when you're doing those data transformation projects, what you're trying to do is integrate a lot of your data so that you get better meaning out of it. And so I may know something, and again too, think about maybe I have a camera that's letting me watch the ethnicity of each person that is checking out. Now I know, okay, this ethnicity is tending to buy more of this thing or more of that thing. And if I'm in a certain area, I may want to target my business towards that.

Or we may even look at it and go, "Okay, my cash register receipt is only telling me that they bought bread. It's not telling me what type of bread." And so now I look at it and go, "Okay, now I need to integrate with my other pieces of data in order to transform what's happening at the cash register into something that's meaningful and actionable for me, so that now I can actually make a plan for my business and change my business as a result of that."

And so, that's what we're doing with data transformation is we're trying to take more meaning and put it into the things that are being generated out of our source systems.
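
Kyle's cash-register example can be sketched in a few lines. This is a hypothetical illustration, not anything from the episode: the SKUs, prices, and catalog entries are invented, and `enrich` is a made-up helper showing how joining raw receipt lines with a separate product catalog turns "what was sold for how much" into profit you can actually act on.

```python
# Raw receipt lines: (sku, sale_price). This is all the register gives you.
receipt = [("sku-101", 3.49), ("sku-205", 2.99), ("sku-101", 3.49)]

# A separate source system: a product catalog with names and costs
# (entries are made up for illustration).
catalog = {
    "sku-101": {"name": "sourdough bread", "cost": 1.10},
    "sku-205": {"name": "whole milk", "cost": 2.20},
}

def enrich(receipt, catalog):
    """Integrate the two sources: each receipt line becomes a row with
    the product name and the profit, which the receipt alone can't show."""
    rows = []
    for sku, price in receipt:
        item = catalog[sku]
        rows.append({
            "sku": sku,
            "name": item["name"],
            "price": price,
            "profit": round(price - item["cost"], 2),
        })
    return rows

rows = enrich(receipt, catalog)
total_profit = round(sum(r["profit"] for r in rows), 2)
```

The join itself is trivial; the point is that neither source is useful alone, and the "transformation" is deciding which sources to integrate to answer a business question.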

Jess Carter: I love that definition. Taking more meaning and basically pulling our data together to generate that, to give us insights to run our business. I think that's amazing, because that applies to both the private and public sector. Most of my data transformation work has been in the public sector, and that's not a financial institution.

But that would've been one of my conversation topics with our bank too: What data are you using to drive your business? What data do you wish you had that you don't? And what data might you perceive you're gathering that you're not even sure is valuable? Because I think it's always interesting to see what they think there. And oftentimes, we find a way to leverage data that's being collected that they were casting aside; they didn't know it was valuable. And generate some meaningful insights into their business.

Kyle Roberts: I love that too. When they ask a question and they go, "Okay, we can't know the answer to this." We always start with that question of: What is the thing that, if you knew it today, would make your business better in 90 days? That's a good question to start with.

And then so you start with the question in mind and then you go, "Okay, do we have that data?" And a lot of times, you're right, Jess. That they didn't even know they were collecting it. And especially it seems like the higher up you go up the ladder, when you get to the C-suite, they didn't know that all of this data is being archived in a really old DB2 system or some data stage or something crazy like that, super old that nobody uses anymore.

And so everybody forgot about it because it's connected to some printer that's got a thing that goes, "Eh, eh," while it's printing out because they think, "Well, it's really old." It may still be really useful data, but they didn't know they were collecting it. And now you can integrate that into all the other pieces of data that you had. Now you've got this great transformation project where you have meaningful insights that you can derive from that.

Jess Carter: Yeah, if you start from the point that you've laid out for them, which I really appreciate that point of how do we generate more meaningful data or insights? The other thing that we can do is, I don't know about your experience, but often, when we're taking on a data transformation, in my experience, it yields some system changes too. It often requires either upgrades or changes or replacements of systems or interoperability enhancements.

And what I've noticed is they can approach those differently too. Often, if you're not approaching a system modernization or interoperability enhancements through the lens of what's meaningful to my business, you lose something. You either lose an opportunity or you lose critical data that no one knew was critical, because you weren't approaching the work from an "Is this meaningful to my business? Can it drive meaningful insights?" perspective.

Kyle Roberts: And I love that you pointed out there that by doing these data transformation projects, we may point out something that they're doing incorrectly. Here's a great example of this. Educational institutions, especially higher ed, measure how well a person is doing in a single course on a scale of zero to 100. And then they transform that to A, B, C, D, or F.

Now we have five categories. And when they did that transformation, they saved it as that. They didn't save it on zero to 100; they saved it as the category. Now, the difference between an 89 and a 90 is the same as the difference between an 80 and a 100. And so what you did is, now your data scientist, who actually knows what they're doing, would come back in and go, "Oh, you made my job so much harder. Because now I'm trying to predict categories instead of predicting this continuous scale, which is much easier to do and provides a lot more variability."

And again, too, take the person that made an 89.4 and the person that made an 89.5: one of them gets rounded up and the other one doesn't. And in fact, they're really, really close to each other, much closer than a 79.5 and a 100 are. But on our scale, we just ruined it. And so, in these transformation projects, there are lots of opportunities to find, "Oh, we're doing this entirely wrong," and change behavior from there forward.
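
The information loss Kyle describes is easy to demonstrate. A minimal sketch, assuming the usual 90/80/70/60 cutoffs (`letter` is a hypothetical helper, not anything from the episode): students 0.1 points apart land in different categories, while students 9.4 points apart land in the same one.

```python
def letter(score):
    """Bin a 0-100 score into a letter grade, rounding first.
    Note: Python's round() uses banker's rounding, but a tie like
    89.5 still rounds up here because 90 is the even neighbor."""
    s = round(score)
    if s >= 90:
        return "A"
    if s >= 80:
        return "B"
    if s >= 70:
        return "C"
    if s >= 60:
        return "D"
    return "F"

# 89.4 and 89.5 are 0.1 apart but get different grades,
# while 80.0 and 89.4 are 9.4 apart and get the same grade.
near_pair = (letter(89.4), letter(89.5))
far_pair = (letter(80.0), letter(89.4))
```

Once only the letter is stored, no downstream model can recover the continuous scale, which is the data scientist's complaint.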

Jess Carter: Okay, I want to tee you up for a fun pet peeve conversation and I'm going to anticipate one.

Kyle Roberts: Oh, I have lots of them, so we're going to be seven hours or so. Yeah.

Jess Carter: Let me ask you this. In my experience, a lot of times clients will get really excited about their data transformation and they anticipate that they need, before they even pull us in, really smart data scientists. They'll grab a data scientist, they'll hire them, they'll staff them. We get in. And more often than not, it's just an interesting experience to appreciate what the market has brought to bear in a data scientist compared to what our unique group sees as data science. Have you had that experience and could you elaborate more on what that feels and looks like?

Kyle Roberts: So much so. I mean, it's almost like we're so inundated with data scientists now that the term you just used, "smart data scientist," is an oxymoron, because a lot of these people are just not smart. And I think it's partly our own fault: there are no standards for data science. There's no governing body that says, "Yes, this is a data scientist. No, this isn't."

I mean, someone who's taken a beginner class on Tableau could go, "I'm a data scientist." And there's no one who could say, "Yeah, you are," or "No, you're not," even for the data scientists who really are worth their salt. I think it was 2018, I remember reading an article, I think it was in Bloomberg, I'm not sure. But I remember reading the article, and the lead on it was that data science is the new sexy. I think we've swung back now, where everyone is going, "You know what? First, things are really hard to predict. It's hard predicting stuff."

In fact, I had a client just Friday that we literally turned down work because I go, "Look, I've spent probably 20 hours with your data. You can't predict what it is you want to predict with your data." And they were like, "Wait, you're turning down work with us?" And I go, "No, I'm 100... I can't in good conscience spend your money. You can't predict what it is. It's too difficult to do with the data that you have."

The problem now is that 15 years ago, you couldn't be a data scientist unless you really understood statistical models. Now you can, because the software is so great. You just throw your data into machine learning, and every piece of software out there will give you a model. What it won't tell you is if your model's any good.

And so, that is a huge pet peeve of mine: we'll come into a client and they go, "Hey, look at our data science model." Oh, it's terrible. I mean, I'll see it and I'll go, "You're not predicting anything. You might as well take a dart, throw it against the wall, and draw a target around it. I think you're going to have better luck." That's a huge pet peeve of mine.
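
One common way to check whether a model is "any good," in the sense Kyle means, is to compare it against the dart-against-the-wall baseline of always predicting the mean: an R-squared at or below zero means the model loses to that baseline. This is a minimal sketch with made-up numbers, not a method from the episode.

```python
def r_squared(actual, predicted):
    """Coefficient of determination: 1 minus (model error / baseline
    error), where the baseline is predicting the mean of `actual`.
    R^2 <= 0 means the model is no better than the mean baseline."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Invented data: a model that looks busy but predicts worse than the
# mean, and one that actually tracks the outcome.
actual = [10.0, 12.0, 9.0, 11.0, 13.0]
bad_model = [14.0, 8.0, 13.0, 7.0, 16.0]
good_model = [10.5, 11.5, 9.5, 11.0, 12.5]
```

Machine-learning software will happily emit `bad_model`'s predictions; only the baseline comparison reveals they are worse than guessing the average.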

Jess Carter: Me too. And I say that as somebody who wasn't into statistical modeling before I jumped into my career in data and tech. But what I started to realize is that in the public sector, I would observe people making directional insights around data sets. And I could at least, from an outsider's perspective, ask questions and say, "Well, how large was the sample size? And how broad was it? And how deep was your analysis? Or what's the longitudinal data you used?"

And I started to appreciate at least around the data that, to your point, there weren't a lot of clear data scientists who appreciated either creating their own models, testing the fit of their model, or understanding the risk of generating insights that would seem right based on the dataset that they've leveraged, but be a really high risk for the organization to go make changes based off of.

And so they were excited that they did some math. But what was missing was that deep, rich appreciation that if the math is sound, statistically significant, and so directionally appropriate for the organization to behave off of, then it would be meaningful and supportive. But all the time, we just have clients who think it's meaningful and go grab a data scientist full time.

The other thing is, it's like a salesman operating alone. It gets a little lonely for a data scientist to go do this work on their own. Often I've found that it's really meaningful when they have a team that can help them and think things through with them at the statistical, mathematical level they're used to operating at. When I find a great data scientist, I find more than one. Right?

Kyle Roberts: And it's funny, because you hit both ends of the spectrum there with those comments. The first is this whole... I mean, there's a great article by a guy named Jacob Cohen called "The Earth Is Round (p < .05)." He's making fun of all statisticians, and he's saying, "We sacrifice everything on the altar of: is it statistically significant?"

And there was a great article, too, that Rosnow and Rosenthal wrote. In it, they said, "Surely, God loves the .06 nearly as much as the .05." And again, they were just poking fun at everybody. But it was a great poke because we all realize that. We're treating this as a pregnancy test, and we're not asking the question that you just said: even if you're statistically significant, are you practically?

If you put enough people in your data set, you're always going to find statistical significance. But your practical significance may be next to zero. I mean, it may be none at all. But then again, the opposite side of that: with a client one time we were talking, and I go, "How many people were in the model that you originally did to do this?" And they go, "42." And I go, "Okay, that works for Douglas Adams and writing novels, but it's not going to work for you. It's not the perfect number here. It's not the answer to all civilization."

And so getting that, pulling that back out and going, "Okay, do we have enough people?" And then other times going, "Do we have way too many? We really need to look at practical significance." You said it perfectly. Both of those things are important.
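
The statistical-versus-practical point can be made concrete with a one-sample z-test. This sketch uses an invented, trivially small effect of 0.02 standard deviations: with Kyle's n of 42 it is nowhere near significant, while with a million people the exact same negligible effect becomes overwhelmingly "significant."

```python
import math

def two_sided_p(effect_sd, n):
    """Two-sided p-value for observing a sample mean `effect_sd`
    standard deviations away from zero with n observations
    (one-sample z-test; numbers here are for illustration only)."""
    z = effect_sd * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))

# Same tiny effect, two sample sizes:
p_small = two_sided_p(0.02, 42)          # n = 42: not significant
p_huge = two_sided_p(0.02, 1_000_000)    # n = 1,000,000: "significant"
```

The effect is identical in both cases, 0.02 standard deviations, which is practically nothing; only the sample size changed, which is exactly why significance alone can't carry a business decision.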

Jess Carter: I can tell that we could talk about data transformations all day.

Kyle Roberts: We could. And pet peeves.

Jess Carter: And pet peeves, that we could get on. Is there one last point or one major thing that you'd share in a quick Solution on the Spot, saying, "Hey, don't forget this. Don't miss this. Or, hey, I've got one more pet peeve for you"?

Kyle Roberts: I like the "don't miss this." Okay, let's do that one. That was one of my favorites. Okay. We do these data transformation projects, and then we make really great dashboards from them. And then we even put in statistical models to predict things in the future. My major professor at Texas A&M, who worked on my dissertation with me, was brilliant, because when I wrote the dissertation, she handed it back to me and she goes, "I stopped reading after chapter two."

And I'm like, "But chapter three is where it got really interesting." She goes, "Yes, try it again. Explain it to me like I'm your grandmother." "Oh, you're right. I was doing things that I understood but that no one else got." And so, that affected me for the rest of my life. Now, when I'm with a client, I'm always thinking, "Okay, who is going to use this, and can I present it in a certain way that they can actually make a decision with the transformation that I'm doing?"

And if they can't, then I totally missed it. I totally missed it. And so there's part of that, when we're doing discovery, where I'm asking questions about, "Hey, how much do you understand, and what do you look at?" And things like that. And I even had one client one time who was like, "I don't care what your dashboard looks like. At the end of the day, print it out on a piece of paper and put it on my desk."

And I'm like, "Okay. It doesn't need to be interactive. That's what is going to be actionable for them." If I can't translate all this work that we did into something like that, then it's not even worth doing. Or I'm not worth my salt. I've got to get to the end user.

Jess Carter: You're just making me smile and look back at my career at the number of times that the operation of an organization wanted dashboards, but the executive leaders were like, "Print them. Bring them into the meeting and lay them on my desk. I don't want to look at a computer screen." And if we weren't going to do that, what was the point? It can feel a little bit... You have this moment where you're like, "But it's a cool, sexy dashboard." And then there's this moment of, "That is what they need to operate. We have to meet them where they're at. The data has to meet them." I love that perspective, Kyle. That's awesome.

Kyle Roberts: Yeah. It doesn't matter that you did some multi-level Cholesky decomposition with zipper estimation. Nobody cares. Give them something they can use. I mean, yes, maybe when you're at your stat conference, everybody go, "Oh, that was really clever." But nobody cares. Can they make action on it?

Jess Carter: That is so cool. Kyle, thank you so much for joining me for Solution on the Spot. This is awesome.

Kyle Roberts: Thank you. This has been a blast. We'll have to soapbox about more things in the future.

Jess Carter: Happy to. The conversation I just had with Kyle relates to the deep dive you're going to hear next. And you're going to hear things about how to avoid and accommodate for mistakes or severe risk in a data transformation. These are little nuggets of gold around if you're thinking about taking on a data transformation in your organization or in your department, really important chasms to make sure you don't fall into.

Now for the deep dive on mitigating risks of data transformation projects. The two experts you're going to hear from are Dave Haas, business development executive, and Susan Schramm, founder of Go To Market Impact. In their conversation, Susan shares why data modernization projects fail and how we can de-risk them.

Susan Schramm: Let's just start with the word risk because everything you just said sounds like challenge. I'm going to have a little bit of audience participation here. This is an emotional subject actually, because at the end of the day, risk is not a technology. I mean technology has risk, but the emotional implications of risk can be high. Type into the chat the word that comes to mind when you think of that risk, what word comes to mind? Word association.

Dave Haas: I mean, right off the bat, while people are chatting in, I'll throw a couple out. I think expensive. I think hazardous, I think danger, all words that come to mind right out of the gate.

Susan Schramm: Oh, wow. And look, the security, if I wasted money, unknown stress. That's [inaudible 00:20:07].

Dave Haas: Stress is a good one.

Susan Schramm: Exactly.

Dave Haas: Let's actually write that one down.

Susan Schramm: That's exactly true. What's interesting about this: let's tackle the subject of risk proactively. Because if you take one idea away from today, my point of view is that risk is not something you wait for; it's something you proactively plan for. And there are two sides of risk. Okay.

One is the loss of assets. That's where you talked about data, you talked about cost, you talked about loss of life in some applications that are really mission critical. I mean, there's a loss of value because of things that you have to protect. And so, when you're managing risk of loss of value, it's all about protecting and putting fences around things and planning for that.

But there's another part of risk that people often miss planning for, which is creating value. This is windows of opportunity. This is risk reward. This is ROI. Creating value is the right-hand side. And the things you think about there are going to be different. Because think of a fireman walking into a fire. He's taking risk, but there's a reward at the other end.

Think of all the people who did research and spent a lot of money developing new drugs to allow us to protect the world from COVID. Think about CEOs who place a flag in a new market and they're planning a lot of money, time and energy, and they want to create return. And that is a new market risk.

Both sides of this are important to plan for, because what you do when you're protecting value tends to be more cautious. Everything about creating value, a lot of times, is windows of opportunity. It's speed to adoption. In that sense, there's an element of not just slowing down to protect, but speeding up and being very aware of what the opportunities are that might pass you by.

And by the way, that's where risk management has slowed down most organizations when there's been a disruption. There was another Harvard Business Review article about, essentially, the lack of agility of companies because of risk. Well, what they're talking about is risk of loss, not risk of creating value. In which case, it's speeding up the company to seize that risk.

Dave Haas: Well, one of the things, Susan, that I've always associated, or started to associate, with any kind of risk or change in a market is the classic technology adoption curve, Geoffrey Moore's curve. And I know all of you on here have probably seen it, but I'll just draw it again. You've got your innovators on one side, and on the other side, you've got your laggards, the folks who don't want to change. And then, right here, you have this area that he refers to as the chasm. Talk to us a little bit about this, Susan. What do you find?

Susan Schramm: Yeah, and it's interesting. Every one of you, if you've been involved in any project, and that sounds like you've been involved in lots, you know that behavioral issue of... This really applies by the way. Moore was really talking about scenarios where people had to do something differently. If your behavior doesn't have to change, this doesn't necessarily apply. But if it does have to change, if processes have to change, if your knowledge of how you apply information has to change, this is really appropriate.

The deal is that the first group is actually fun and easy and loud and exciting. But then there's this, "Ugh, oh my gosh." Most of your return on investment is on the right, when you get people repeatedly growing the market and doing more of it. That's where your profit usually is. You don't make money on the left-hand side; you make money on the right-hand side. Or if it's a social impact, that first group is your proof points, but the right is when you actually create broad impact.

He proposes to create what he calls a bandwagon effect, the idea of leveraging each group along the way to persuade the next one. Essentially, you get momentum and cross the chasm by intentionally knowing this is going to happen and intentionally planning for people in your early adopters who can relate to the people on the right in different ways, whether it's the same industry or the same demographic or whatever.

They may be early adopters by behavior, but when it comes to helping them help you get to the right, they've got some way to relate to that next group of people. And you can leverage that and create momentum. That's all about technology adoption. I'd like to suggest, when HBR talked about the problem, they talked about something called culture. Culture is primarily made of humans. Why don't we look at the human adoption curve?

And this was really expanded by the behavioral scientist William Bridges, who talked about it in a book called Transitions. He talked about how every human, when they confront change, has to deal with this valley that they're going into. And change is something that happens to you, even if you don't agree with it. It could be a reorganization, it could be a new regulation, it could be an introduction of new tools and processes to leverage data in new ways. It could be any of those things. Those are changes.

Transition on the other hand is the journey we go through in our minds as we deal with that change. Change can happen really quick, but transition takes more time. And Bridges talks about three phases. The first is letting go of the old. They fear what they don't understand. There's risk that they're going to look stupid, whether they're angry or frustrated or embarrassed. There's this, "I don't want to go to this new space."

Then, interestingly enough, there's a neutral zone. And actually, that's the first point where they might be able to actually hear you a little bit, because that's when they're maybe in a mode where they acknowledge that this new thing, this new system, this new way of working is going to come. They still may be skeptical, but at least they're acknowledging it and evaluating it. The last step is what would be, for all of us, a new beginning, when you are ready to just listen and build skills and work in a new way.

At that point, emotionally, you tend to feel relief because you've gone past that part in the middle. But that point is when you really can create vast adoption. Moore's adoption curve is all about the technology itself being adopted, but inside that, when you think of the people you need to do something new, they're all going through that human adoption curve.

Even the early adopters have to go through this. They just are faster to get through that transition into new. Whereas other folks take a lot longer on the letting go and then you go through it. Knowing that about these humans you have to deal with is really important to think through.

Dave Haas: Well, Susan, here's a term that you use. It's called de-risking. Talk to us a little bit about what de-risking is all about.

Susan Schramm: Yeah, my strong point of view is that you really don't have a choice. If you want to create a result, you have to plan for the risk, as I said. And so, some people talk about de-risking in terms of eliminating risk. That is not possible. De-risking is an intentional and systematic process to identify and optimize risk in order to achieve a desired outcome and enable you to lead with more confidence.

But it's that idea of intentional. The word systematic is really key here. I find a lot of organizations don't have a very systematic approach. Even if they have a decent new product introduction process, they'll do a little bit of a SWOT analysis up front, but they don't have a rhythm of reviewing the risks. Often, risk is dealt with ad hoc, or it's dealt with late. Risk tends to creep up on people and surprise them.

And then culturally, many leaders don't even like to talk about risk. Somehow they feel like if you're going to talk about risk, you're not bought in. If you have that kind of environment, where a leader says, "Hey, listen, get on the train. We're leaving. If you're not part of this, you're not involved," well, people still feel the risk and they worry in their heads, which ties back to the point about stress.

That's where anxiety happens. You're sitting there worried that something's going to happen to you, to your organization, to your project, to your company, or to the users, and there's no plan to address it. That's a really important part. And the thing is, you're not going to eliminate risk; you're going to optimize it.

That means your process has to review risk regularly and look at not only how you're protecting against it, but also how you're scanning the environment to make sure you can move quickly if things change. It's not scary; it's something to be planned for by identifying and mitigating risk, and by optimizing risk to make sure you're able to see windows of opportunity.

Dave Haas: Susan, what are some of the common mistakes that people make when they launch new strategies?

Susan Schramm: Some of this sounds deceptively simple. But here's what's interesting, after all the work I've done myself in launching, and all the organizations I've done research with and served as clients, in different sizes, different industries, nonprofit and for-profit. Here's the deal: if you don't avoid these common mistakes, you will waste money, energy, and time.

And even if you do launch on time, you're going to get stalled and you're definitely going to be frustrated. With that, here's the first one: focusing on what instead of why. I spent most of my career in tech, and we love our cool, shiny solutions. We put a lot of money, time, and energy into our cool tech and analytics. But often the why is not clearly understood. And when I talk about why, I mean: what business problem are you solving?

The business problem is not data; data is not the business problem. The business problem has to do with the outcome you're creating. According to Changepoint, 80% of project executives don't know how their projects align with their company's business strategy. You've got to know the strategy, because as you go along, you're going to have to link your choices of tools and solutions to that strategy. And the strategy itself may have a lot of unknowns too. You need to be clearly, tightly aligned with it.

Another interesting one is when they know why they're doing it, but they don't know why it's important right now. What's the window of opportunity they have to hit, or they'll miss it? Being really clear on why, and why now, is really important. To give an example in the world of data transformation: if you're a hospital, your why is to lower the number of patients who return to the hospital after they've had surgery.

Or if you're a city government, your why is to be absolutely sure you're delivering your citizens the highest level of quality. Or you're an association, and your why is to deeply understand the interests and concerns of your members to ensure your research, or your policy and regulation work, is relevant. Or if you're a trucking firm, your why is to keep your fleet on the road serving customers while minimizing the time your drivers have to take off due to maintenance.

Those are business reasons, and the data transformation projects have to support them. If you skip this, what happens is, number one, when you go to get people to take action, they're not bought in. They don't see the project as relevant. The why is the language that helps your project relate to their priorities. Number two, if you're not clear on this, you don't know how to make trade-offs. If your project runs into bumps along the road and you don't know how it fundamentally relates to the business priority, you don't know how to make choices.

Lots of time and money can be spent with no outcomes, and you can accidentally miss important windows of opportunity. Because you didn't know what the business implications were, you may be chugging along on your own timetable and not realize that you've just missed a big opportunity. To avoid this common mistake, ask yourself: are you personally, fundamentally clear on the business purpose of your project and the problem you're solving? How will you know it's solved? And why is it important to solve it now?

Dave Haas: Susan, let's skip ahead a little bit and talk about planning to de-risk.

Susan Schramm: Okay. Here's the thing: my guidance to people is to be systematic and start up front. There are two points in time when you need to plan for risk. Number one is before you ever launch. That's like the luxurious time when you're building a house and you can sit there with the blueprints. You don't have to waste money moving walls, because you can proactively see what a change would do.

The other time is when you're stuck. And when people are stuck, they're scared and they're frustrated. At that point, it's really important to back up. Step away from the car. Be strategic and think through what the risks are at this point and how you can mitigate them.

The one thing I don't want to miss is that when it comes to the people we want to move to action, we need to anticipate the issue of the brain. Believe it or not, brain science is going to help you here. A lot of times, the biggest impediment to moving forward is getting people to do something new.

What's interesting is that if you were to take a picture of the brain from the top, as Dave is drawing it, there's what they call the old brain and the new brain. The neocortex is the outside brain, and the limbic system is in the middle. The neocortex is where all the data, information, and rational thinking are, but the limbic system at the base is where your fight or flight is. It's where your emotion is, and it's also where your decision making is. And it's there to protect you.

But if you don't plan for the information you need to spur both, the outside brain, so people know what information they need to take action on, and the limbic brain, so they know why they need to, you're going to miss opportunities. It ties back to the human adoption process. When you go to launch, intentionally think about what you can do to address both.

For instance, on the outside part, clearly you've got a lot of data and a lot of good information about what the impact will be, and you can have all sorts of charts. But a lot of times, your storytelling really comes into play here. Your storytelling can help both parts of the brain of the people you're asking to take action respond better to the outcomes.

Dave Haas: Talk to me about how your firm applies your systematic process to de-risk new strategies. And talk a little bit about how the process actually works and perhaps give us some of the things that we can take away.

Susan Schramm: Here's the process. It's something that takes weeks, not years. Basically, we systematically shorten the time it takes for an organization to think through these very basic ideas, but do it in a way that, all the way along, you're documenting risks and assumptions. Because a lot of times you're making assumptions and you don't have all the information.

So the first part is identifying the risks. The second is preparing for how to address those risks. And the third is putting a plan in place for ongoing management. A lot of times this is done only once; many, many new product introduction processes do it once. But having a rhythm for scanning for change and reviewing what those risks are is part of the process. And even if you're not the boss, even if you're not the CEO or the board or the head of the project, you can create an environment where people can talk about risk.

Here's an idea. At your next staff or team meeting, play an exercise called "play the movie backwards." Essentially, you take your launch, say it's two years out, and you're watching the movie of how it went perfectly. Then you play it backwards and ask, "Where along the movie was there a plot twist where we did exactly the right thing?" I say start with that because people will quickly go to the next part, which is, "Oh my god, there are plot twists where we don't do the right thing."

And if all you did as a group was simply put together the list of the things that come to mind where, if they don't happen, there's a consequence, you'll find they really do relate a lot to the categories I've framed here. The second thing, and this is the second common mistake, is the who. There's a tendency with new initiatives to focus on the usual people you think of.

Okay, we have a new opportunity for sales, so we're going to get this tool ready for the sales team. Okay, great. Who else needs to know this to be able to take action? Who else might be implicated? What's interesting is that there are a lot of unexpected people who are impacted. What I find very often is that if you're launching a new way of doing business, the front line is not just the salespeople pitching it. It's the accounting department.

Someone calls into accounting and says, "I'm trying to get my bill done," or whatever. And accounting goes, "I don't know anything. I'm a mushroom sitting here. They never tell me this stuff." Thinking about channel partners and suppliers who might need to know your strategy is a way to frame who needs to do what to take action. That's an approach.

Dave Haas: How important is it to align the decision makers with the change makers?

Susan Schramm: It's interesting, because in Built to Last, Jim Collins said that "a new visionary strategy is 1% vision and 99% alignment." And as I say, even great strategies will fail if people don't get aligned to execute. Strategies succeed when everyone knows what's going on. How important is it? It's critical.

The question comes down to: how do you know when they're not aligned? Not talking about it is the quickest way to ensure that people won't be aligned. If you're not willing to talk about the risks, you will guarantee that they aren't aligned. One of the things we do in these workshops is take diverse groups of people and walk through the "let's play the movie backwards" kind of exercise. We actually create a safe place for people to talk about what they're worried about.

Supplier diversity, for instance, is a very important goal, but it isn't always understood when it comes to finance: "Why are we doing this if it's going to cost us more money?" Or in some cases it's data. I ran a focus group just the other day where we had two different reactions. Executives talking about digital transformation were saying, "This is all being driven by vendors and IT departments. I don't see why this is really relevant to us."

And then you've got IT department people frustrated because they're saying, "All we want to do is do this for the business, but we can't get anybody to work with us." Well, that's a lack of alignment. In many cases, the exercise you need to go through is simply giving people a safe place to talk about risk so it's not personal. A lot of people don't want to talk about it. They think they'll look like a Debbie Downer, or look disloyal, or look like they don't get the picture.

But in fact, in their minds, they're thinking, "This is on a path to crash." That's what I do: create a safe space for people to talk about risk, get the different perspectives in the room, and then keep it going. Don't have it be a one-time shot.

Dave Haas: Is there a way to measure the success of a de-risking program?

Susan Schramm: Well, it's interesting, because here I am saying, "You need to have a way to measure the success of your initiative. How are you going to measure the business problem you're solving?" The success of de-risking has both a business and a human impact.

When it comes to the business impact, a lot of it is time to market and time to results, whether that's the on-time, on-budget metric or speed to market for a new launch. One thing I found when I was on boards or in positions of funding new projects: people would come in to me and say, "We need money for this. It's so important." And all I could do in my brain was ask, "Where is it going to fail?"

If you can systematically identify those weak points and say clearly, "Here are the areas of biggest risk and here's how we're managing them," you actually create calm, and you get more people interested in funding the project.

But there's a second part, which is the people. We talked a lot about culture and people. One metric is employee engagement; another is retention on the team. People quit when they feel like a project is going down. Getting people to stay engaged and stay with the project is an easy metric.

Jess Carter: Thanks for listening, guys. I'm your host, Jess Carter. Don't forget to follow Data Driven Leadership wherever you get your podcasts, and rate and review us to let us know how these data topics are transforming your business. We can't wait for you to join us on the next episode.