Transcript
This has been generated by AI and optimized by a human.
Jess Carter:
The power of data is undeniable. And unharnessed, it's nothing but chaos.
Speaker 2:
The amount of data was crazy.
Speaker 3:
Can I trust it?
Speaker 4:
You will waste money.
Speaker 5:
Held together with duct tape.
Speaker 3:
Doomed to failure.
Jess Carter:
This season, we're solving problems in real time to reveal the art of the possible, making data your ally, using it to lead with confidence and clarity, helping communities and people thrive. This is Data-Driven Leadership, a show by Resultant.
Hey everyone. Welcome back to Data-Driven Leadership. One of the things I love about data science is how it's constantly evolving, not just in the tools and tech, but in the people who make it happen. It's a space where collaboration, creativity, and communication matter just as much as code. Today, we're exploring the intersection of leadership, innovation, and mentorship with someone who really brings it all together, Alexa Myers, our senior data science manager here at Resultant. Alexa's built a reputation for leading high impact AI and data initiatives, mentoring early career talent, and helping teams translate complex insights into meaningful decisions.
She's really passionate about making data approachable and using it to drive real-world change. In our conversation, we're getting into what it takes to build strong data science teams, why great data functions still matter in an AI first world, and how to communicate effectively with stakeholders at every level. We'll also talk about how AI is amplifying the human side of our work, not replacing it, and what skills matter most as the field keeps evolving. So whether you're new to data science, leading a team, or just curious about what it means to lead in a data-driven world, this episode has a ton of insight and inspiration packed in. Let's get into it.
Welcome back to Data-Driven Leadership. I'm your host, Jess Carter. Today, we have Alexa Myers, Senior Data Science Manager here at Resultant. Let's get into it. Alexa, welcome.
Alexa Myers:
Thanks, Jess. Thanks for having me.
Jess Carter:
Yeah, I'm so glad you're here. We're going to start with some really basic stuff that is going to be unimpressive to you, but I think helpful to others. We'll consider them icebreakers. Is that a deal?
Alexa Myers:
Yeah, that works.
Jess Carter:
Perfect. Okay, so there are more people than there were a year ago or three years ago who know what data science is, but there are some people in the universe that have never engaged with it, do not know what data science is. Can you take a stab at a 101 elevator-ish pitch of here is what it is?
Alexa Myers:
Now that you say it's an icebreaker, I would actually say it's a really hard question, because when people ask me what my job is, depending on the audience, I give a very different answer. At the fundamental level, I feel like it merges two occupations: half statistician, half computer programmer, but then piecing that together, leveraging it for problem solving, and being able to tell the story with data for people.
Jess Carter:
That makes sense, and it's been this big boom, right? The last, would you say... is it five years?
Alexa Myers:
Yeah.
Jess Carter:
Seven? I don't know. Okay, so over the last half decade data science has really been on the rise. I just remember during COVID it really kind of took off. One of the questions I have is, in the market I started to see it was almost a position where you were promoted as an analyst to a scientist. I could be wrong, and you are allowed to correct me on this podcast. In my head there is a distinction that I've articulated, so I want you to be truly corrective or affirming in whatever nuance you want: a data scientist is not like, oh, you're an analyst and eventually you get promoted to a scientist. There is a level of statistics, mathematics, et cetera, that you need to be the people that we work with, like yourself. There are people who've worked at NASA and they have PhDs in mathematics. It's "we are building a model that will work for your custom situation" versus "we are adapting a model and just feeding it data to see the results." I feel like the market calls both of those data science. Do you know what I mean?
Alexa Myers:
Yeah. It's one of those things, and I think it's because the field is truly relatively new. I mean, when I was in academia for my undergrad, a data science program didn't exist because the occupation didn't exist, and when I was looking for jobs, the job title of data scientist didn't exist at first. I think the market is still trying to figure out exactly what it means. I think we are finally in a position, to your point, where we are starting to see that distinction, and it is truly around that innovation, building models, those custom solutions, and the problem-solving mindset that you have to have, truly building something innovative for whoever you are working for.
Jess Carter:
Yeah. Where I also struggle is, from my observation, there aren't a lot of customers or C-level executives, for instance, who know to ask for science. They're asking for innovation, like you just said. They're asking for outcomes that are very different, more aggressive, creative, innovative, but they don't know that science is what's required for that. Is that your experience, too?
Alexa Myers:
Yeah, and I would say that I've started to see that shift in the last five, six years. When I first started in a data scientist role, I was definitely more of a hands-on keyboard, more of that statistician, programmer. I would work with consultants. It was like this game of telephone almost. The consultant would go to the client, the client would say something, the client would go back to the consultant. The consultant would come to me, but then I would have questions. And so it was really this game of telephone at first.
Now, I think one of the things that has really driven that shift is the push of AI and the big AI hype that we're in. But with that has come a lot of conversations around education, stakeholder engagement, ethics, privacy, and so they needed the subject matter experts in the room. And so, now that we have made this more of a communication circle and everybody's in the conversation, it has truly morphed into: let's not just listen to a solution. Let's actually solve the problem. Let's think about the problem and make sure that we're choosing the best solution for it, especially when we're saying we want AI but we don't exactly know what we want. So let's have a really solid conversation, let's bring our subject matter experts in the room, let's science this thing out, and let's make sure that we're building the best thing for the customer.
Jess Carter:
In all of the chaos of the last half decade, science has taken off. Now AI is the hype, right? There's real concern for me as a leader to see people trying to take AI and jam everything through it without the science. There's real fear I have that you're going to think you're making a data-driven decision or an outcome-oriented call as a leader, and you are just wrong because you don't have the right people with the right expertise to check you on it.
I mean, you and I have been in really nuanced situations where how you listen to the data, interpret it, and make decisions or adjust your model really impacts people's livelihoods. And so it's an amazing circumstance in the market, because all the right things could happen. It would be so cool. I imagine as a scientist, with AI you can get more efficient and effective much more quickly, but you also know to police yourself on security and ethics, right? Is that right?
Alexa Myers:
Yeah, for sure. I think that there are two things that are really fundamental in trying to mitigate that risk. One is something that has stayed the same and will always stay the same no matter the technology: the foundational data, like garbage in, garbage out. If you have bad data, regardless of whether it's AI or a simple linear regression, you are going to have bad output. That's something that I think will always stay consistent. I feel like people have thought that with AI that is no longer needed. Maybe we'll get there one day with technology, but I highly doubt it. I think that's something that's always going to stay consistent.
The second piece is, and this is where I've really seen my role start to shift, I was saying that a lot of the time I was hands-on keyboard. Now it's really about education, telling the story, and working with clients to make sure that, in how they're leveraging AI, they're educated in the way they need to be so that they're making ethical decisions and truly leveraging it to its fullest potential.
Jess Carter:
Yeah. That makes gobs of sense. Well, and you're a consultant. You do this for a living across a whole bunch of companies, private and public. I've seen you do this work. So the expertise you have is so rare, because the whole industry is trying to figure out how to do this. We've got so many clients where I talk to them and they built a data science department or they hired a data scientist and they're so excited, but that data scientist has no friends. There are no other data scientists, there's no one else to check them on it, and nobody knows how to manage their outcomes, because there are no scientists above them.
And so, we could take this a bunch of different ways, but one of my questions for you might be: if I'm a customer, if I'm trying to do something innovative as a leader, do I have to know when I need science, or can I just be outcome-focused and say, this is the outcome I want? And when we present, hey, part of that is going to be we need to build out this model and we need this data to be clean enough and dah, dah, dah, dah, is that okay? What's your take on that? Should every executive know?
Alexa Myers:
I don't think so, necessarily. I think that's where we come in as partners with thought leadership, and we should be able to discern what is needed when. I think a lot of the time, especially as tools and technologies have rapidly developed, we always want to do the next shiny, pretty thing. So when machine learning became really big, everyone wanted to use the most advanced option. There's a classification model called XGBoost, and it's very complex and really hard to explain to an end user, but sometimes you just need a logistic regression, which is very much consumable and may give you the performance that you need. So I think it's our duty as data scientists and consultants to know when you need everything and the kitchen sink versus when something much simpler is just as impactful and easier for the consumer to digest and understand.
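To make the trade-off Alexa describes concrete, here is a minimal Python sketch comparing a logistic regression against an XGBoost classifier on the same task. The dataset is synthetic, and scikit-learn and xgboost are assumed to be installed purely for illustration; this is not the modeling setup from any specific engagement.

```python
# Minimal sketch: an explainable baseline (logistic regression) versus a more
# complex model (XGBoost) on the same classification problem.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

# Synthetic tabular data standing in for a real client dataset.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# The "consumable" option: coefficients map directly to feature effects.
logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The "kitchen sink" option: often stronger, much harder to explain to an end user.
xgb = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1).fit(X_train, y_train)

for name, model in [("logistic regression", logit), ("XGBoost", xgb)]:
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
# If the simpler model's score is close enough, its explainability often wins.
```

In practice the comparison would use the client's own data and validation scheme; the point is simply to measure whether the added complexity buys enough performance to justify the loss of interpretability.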
Jess Carter:
Okay. Yeah. So then if a company is trying to stand up its own science department, what advice would you give them? What are the things to really protect against? Because again, when I think about a scientist alone on an island, I've seen our scientists and I picture us all in lab coats in an eighth-grade lab, where we actually do need to bounce ideas off of each other to validate each other's opinions, to make sure there are no biases. It has been important to have that. And if you're going to build a department, I imagine there are probably a few things where you'd be like, "Hey, maybe you've seen a client with a really successful department," or maybe you're like, "I wish I saw this." What does that look like to you?
Alexa Myers:
I would say, first, that we have a wonderful data science team here at Resultant. I truly believe, and I am of course biased, that I work with some of the most hungry, humble, smart individuals I have ever met. I think one of the things that makes that team work so well is we all come from different backgrounds. Only a few of us have degrees in data science, because the field is so relatively new. Those with zero to five years of experience are the ones who may have a degree in data science, but it's not necessarily a requirement to be a data scientist.
You need the foundational programming skills, you need the mathematical background, and you need to be honestly, genuinely curious. And so we have individuals who have backgrounds in mechanical engineering. We have an individual who has a PhD in astronomy. We have a PhD in particle physics. We have truly the widest range of individuals and backgrounds, but having them all come together with their different experiences and their different ways of thinking is truly what makes them exceptional.
Jess Carter:
Not everybody can afford that, right? And so that's when it does make sense to be like, "Hey, it's a new industry. It's a new concept." I don't ever want our podcast to sound like a commercial, but whether you use us or someone else, it might make sense to outsource that to a firm that does it for a living versus trying to build your own. I have seen a lot of companies struggle with retaining those people because they can't afford to bring in the kind of department that a consulting firm can, because we're flipping through science projects and people are used to that. There's probably some decision of when you build versus when you buy the expertise. So then one of my other questions was going to be about stakeholders. If I were to map out all the projects you've done, which I would do okay at, I'd get like a C minus.
It's not a data science project for this client. This client wants a program efficacy model. They want to understand if what they're doing is generating the value that they expected. In a different instance, they want to help inform users of ways they could make better data-driven decisions as easily as possible, with one to three clicks, and they kind of can't ignore it in the way that we design it, right? Like the WRE, the workforce recommendation engine for the Department of Workforce Development we've talked about on another episode. So this is where I'm like, I think what's neat is if leaders just focus on asking for the most right thing for their business, they're probably going to come across data science. And then when they do, this is my next question for you to build on: when you provide status updates about data science to executives who are not familiar with it, it's a different animal.
I remember my first... So I was doing big old lift-and-shifts back in the day, system transitions, and it was more about the system that we were moving from and to. Then we realized data was a product, and I started running all these data projects: how do you build data products, how do you value them, and then cleanliness and governance mattered. And then we started to inform models for outcomes, and that's when I first encountered science and thought it was really interesting. But if you're doing lift-and-shifts, you could have spent your whole career over the last 50 years doing those. As an executive, you know exactly what questions to ask, how to audit it, how to check your vendors. If it's app dev, it's usually sprint-based and there's a burndown.
There are all these different ways the market handles all of those things. At some point you've done it three to five times and you can kind of manage it as an executive. Science is different because it is science. You go through the scientific process and you get to stage six or whatever, and you realize your data isn't as clean as you thought and it's blocking you from the outcome you wanted, and you either have less, you're adulterating your outcome, or you have to go back. And so to me, it is a different thing to help your stakeholders understand how to effectively manage it, make decisions around it, and understand the progress you're making, because sometimes the progress is we're on our 18th model and it's still not working, and at some point a client's going to say, "How many more of these?" And you don't... It's like Edison: until the light bulb works, the light bulb doesn't work.
Can you give us some advice about how much does a stakeholder have to understand in order to be an effective leader over a science project?
Alexa Myers:
Sure. I am always a big believer that data science doesn't have to be a black box, and I think that regardless of your technical background, it is important to understand, even at a 10,000-foot view, what's going on under the hood. That's where, once again, it's our duty as data scientists to make sure that our stakeholders understand how things are working at the level they need to. Sometimes that can be really challenging, because people understand things in different ways as well, and so how you explain it to one individual may need to be different.
You had mentioned the workforce recommendation engine with the Indiana Department of Workforce Development, and this was one of those examples where that happened. I had to explain it by breaking down the individual components. The workforce recommendation engine, at a high level, is a hybrid recommender system, which is a machine learning algorithm that you train. The way I broke it down is that Netflix uses the same type of algorithm, so I screenshotted my top picks from Netflix and walked through how the content-based and collaborative filtering worked: Netflix will recommend more documentaries to me because I like documentaries.
And then it will also look at people who like documentaries, see what they have watched, and recommend to me a documentary that I haven't seen. That's how I explained the WRE; it's an easy thing that individuals can grasp. But Josh Richardson, who is now the commissioner at the Indiana Department of Workforce Development, got it, and then he paused and was like, "But we do so much more than recommend a movie. We're recommending occupations, but it doesn't stop there. There's so much additional information that's provided."
It was one of those moments where I paused and thought, I have this great example, this is exactly how this works, but I saw the light bulb go off in his head. He got it, but then he even took it a step further: this actually isn't a perfect example, because yes, we're recommending something, but then we have B, C, D, and E in addition to it. And so that's an example I always go back to: everybody is going to interpret something differently, so make sure you're meeting your audience where they're at.
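For readers who want to see the content-plus-collaborative blending Alexa describes, here is a toy Python sketch of a hybrid recommender. The ratings matrix, item tags, and blending weight are all made up for illustration; this is a sketch of the general technique, not the actual WRE algorithm.

```python
# Toy hybrid recommender: blend a content-based score (item similarity to what
# the user already liked) with a collaborative score (what similar users liked).
import numpy as np

# Ratings matrix: rows = users, columns = items (0 = not seen). Fabricated data.
ratings = np.array([
    [5, 4, 0, 0],
    [4, 5, 1, 0],
    [0, 0, 5, 4],
], dtype=float)

# Item feature matrix: e.g., one-hot tags per item ("documentary" vs "drama").
item_features = np.array([
    [1, 0],
    [1, 0],
    [0, 1],
    [0, 1],
], dtype=float)

def cosine(a, b):
    # Cosine similarity with a small epsilon to avoid division by zero.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def hybrid_scores(user, alpha=0.5):
    # Content-based: user profile = average features of items they rated highly.
    liked = ratings[user] >= 4
    profile = item_features[liked].mean(axis=0) if liked.any() else np.zeros(item_features.shape[1])
    content = np.array([cosine(profile, f) for f in item_features])

    # Collaborative: weight other users' ratings by user-to-user similarity.
    sims = np.array([cosine(ratings[user], ratings[u]) if u != user else 0.0
                     for u in range(ratings.shape[0])])
    collab = sims @ ratings / (sims.sum() + 1e-9)

    # Blend the two signals and mask items the user has already seen.
    score = alpha * content + (1 - alpha) * (collab / (collab.max() + 1e-9))
    score[ratings[user] > 0] = -np.inf
    return score

print(hybrid_scores(user=0))  # highest remaining score = recommended unseen item
```

A production recommender would add training, evaluation, and far richer features, but the core idea is the same: combine "items like what you liked" with "items people like you liked" into one ranked list.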
Jess Carter:
Yeah. I would argue as someone who knows him, he's a pretty sophisticated audience member, right?
Alexa Myers:
A hundred percent.
Jess Carter:
He gets it. And so I think that one of your undervalued skills in the market is your ability to communicate. Does that shock you?
Alexa Myers:
No. I think it's something that is going to continue to be more essential in the data science field, though. I said earlier that I've really seen my role shift from hands-on keyboard to engaging stakeholders and telling the story. That's why I went back to school and got my master's in data science with an emphasis on data science storytelling: truly being able to break it down and explain really complex algorithms to non-technical audiences, and understanding where to meet them and at what point.
Jess Carter:
Oh, my gosh. My hope would be that if someone is listening to this and finds themselves knee-deep in a science project and struggling, they'll hear: you're not alone or crazy. This is really difficult and different. With all the AI stuff, my number one concern is making sure we're using it in a way that allows us to spend more of our time being human, and what I love is that science projects innately require humans. I think about times when I was with Dan LeBar, one of our coworkers and scientists, on the COVID response, which we've talked about on the podcast in the past. That was one where we were on our 18th instance of what wasn't working and the client was really struggling, but no one had figured this out.
I remember when it was like, well, why does this model work perfectly in Italy, the Twinset model, but it wasn't working here? And it was like, well, what's different? Well, the population is largely older. But there was no AI that was going to surface those kinds of insights or reflections. There's this human need in the workplace for reflection and communication that I think, even with the smartest people, is important. Right?
Alexa Myers:
Yeah, I would agree. I mean, the need for a human in the loop will always remain, to that point: there are just things from a human understanding that a semantic model isn't going to be able to grasp, one-off instances like that. Going back to ethics and privacy and how to leverage row-level information on individuals, a model isn't going to know whether it should use this or not. Those decisions are things that will have to remain with humans, making sure we are leveraging AI in the appropriate way to enhance what we are doing, not replace it.
Jess Carter:
Yeah. Well, and to go back a topic, too, because when I think about you and some of this, there are times we're stumped with a scientific problem and we need friends who are smart and also scientists. So that's why I think, in my opinion, a scientist on an island is not a good thing. I would equate it to my dad. He led sales for consulting firms and non-consulting firms, software, hardware products, his whole career, and he would always say a salesperson alone needs high levels of engagement, positive public praise, carrots and sticks. They need that touchpoint celebrating them. They're a different type of person and they really need that energy. I feel like a scientist needs that level of academic rigor, like challenging problem solving.
If any of us have been cooking dinner and we've turned off our work brain and suddenly we're like, "Oh my gosh, I know," if you've had that moment, it's because you put your problem on ice and went and did something different. A bunch of us rock climb or used to. You're solving a different problem and it kind of helps your brain. I feel like a scientist needs someone else's problems. They've got to have that peer review access to be like, "Well, I tried this over here. Would that work over here?" The data's different. So I just think how we manage those teams and make sure that we're not alone as stakeholders and they're not alone as scientists is genuinely important. I think there's going to be data that comes out and says, "Please don't put them on an island."
Alexa Myers:
I would agree. One thing that's really cool, which you mentioned earlier, is that you never really start with a data science project. Data science is woven into it, so your end state is an application or a dashboard and data science is a component of it. Most of the time our team has the opportunity to work with our application development team, our data engineering team, our business intelligence team. And so with that, we get to truly collaborate and bring different minds together. I was going back to why I think our data science team is really great, but even then, thinking about how we are developing and how we are incorporating that into the end-state solution our other teams are building in the right way, sometimes when we are thinking through that, it sparks ideas that we may not have thought of in the first place.
Jess Carter:
Okay, so you just brought me to one of my last questions for you, in that way. Science is new. There are a lot of people that are generally maybe younger in that space, if I can say that, but you're working with some of the most senior engineers and architects, who are sometimes more seasoned. And so my hypothesis, which you can correct if I'm wrong, is that you might have a really interesting take on the multi-generational workplace, which is a topic that's coming up in leadership conversations. It's come up three times this week for me in different capacities with leaders, where they're like, the generations are really different, how we work together is really different, and no one has a silver bullet of what is working and what isn't. Do you have an interesting perspective on that? I feel like science is sort of breaking down all the walls in a really interesting way.
Alexa Myers:
I would say one thing I mentioned earlier, one piece of advice that I do give to early-career folks... Early-career mentorship is something that I'm very passionate about. I had great mentors when I first started out, and I wouldn't be in the role that I am in without them, so I really try to give back. And one piece of advice that I always try to give is: just because you're not the smartest person in the room, or who you perceive as the smartest person in the room, doesn't mean that you don't have something to add.
With that, what I think is unique about the data science field, something I mentioned earlier, is that the majority of those principals, the very senior individuals, didn't have the opportunity to get an education in what they are doing. The newer, early-career folks, the zero-to-fives like I said, are the first people who have had the opportunity to sit down in a classroom and be told, "This is how you do this." Everybody else was building the plane while flying it, learning how to do it in their occupation as it continued to morph over time. And so the early-career folks bring this really unique perspective: they were taught the data science process as part of their education. They have this unique background in comparison to people who may have been in the field longer, solely because of their generation. So they bring different things to the table, once again, different minds, based on that education.
Jess Carter:
I think that's a really good perspective. That's where I see how you guys work. I see the beauty, too, of that paired with these super senior engineers turned scientists, these people who have decades of access, and a lot of our work is in state government. So one of the things that's really appealing to our data scientists, I imagine, is that you don't get to work with row-level data of this complexity, size, and outcomes everywhere. We have a lot of scientists who love what they do and are driven because they've been able to work with this kind of human data, a lot of PII, a lot of data that's very protected, highly secure. That marriage of experience and process training is really neat. I'm sure it's riddled with difficulty at times, but it's so exciting. And then you have maybe these clients who are excited too, but I bet they don't understand it, they don't know how to manage it, they just care about the outcome.
So, it's a messy prospect, but I think that with the storytelling skills, with the communication skills, with the ability to be open and willing and transparent about what we're doing and how, I feel like you guys are doing a phenomenal job. And I think that's where there's this open invitation to say, "Whatever generation you are, you bring value. Let's all work together and figure it out." So in some ways, while it's the hardest and feels so complicated, it also might be the most valuable. Working together in these ways on these projects, we're getting aspects and vantage points that we've never had before.
Alexa Myers:
Absolutely. I would agree with that a hundred percent. And once again, not even just data scientists with data scientists, but cross-functionally across teams as well. Even if you're sharing an idea that may not be the best idea, it may spark something else from somebody else.
Jess Carter:
Yeah. Yeah. That's awesome. I had one more question for you, and then if you have something else you want to talk about, we can. The future of data science: I would love to hear where you feel like it's headed next. I realize that's like asking the Wright Brothers where they think air travel is going five years after they flew a plane for the first time, but I am genuinely curious. Do you have any insights or trends that you're expecting?
Alexa Myers:
I would say one thing I think we're going to start seeing is leveraging AI to replace broken AI solutions. With the hype of AI that happened, and I think we're starting to come down off the hype train a little bit, we shoved in these AI solutions without necessarily thinking about how they should have been implemented, and they didn't really solve the problem. So I think we're going to start seeing that more frequently: going in and actually deploying the solutions in the way that they should have been.
Jess Carter:
Speaking of, I just saw this research. I mean, this is kind of speaking of. I think that's really interesting, so more like leveraging science to help make AI more valuable. There was this research that just came out, I forget the entity, I'll put it in the show notes if I can find it, but it was basically saying an entity fed an AI all of those clickbait articles that people get, and it made the AI dumber. That's my word, not theirs, but they were like, you could see that its problem solving went down and they couldn't repair it, they couldn't feed it enough good stuff-
Alexa Myers:
Interesting.
Jess Carter:
... for it to be back at the quality and fidelity that it was when it started. I was like, "We know. We know this." But what an important testament, coming full circle, to let's not make AI a silver bullet, because it's not. Let's use it for the right things at the right time and engineer it properly, and then let's pay attention to being human. When you feed artificial intelligence some of the stuff that a lot of human beings are consuming, look what happens to it. Let's extrapolate, and maybe that's not scientifically appropriate yet, but I imagine the inference is applicable and allowed to consider: are you doing things as a human being and a leader that are bringing about better, faster, stronger leadership, or that might be adulterating it?
Alexa Myers:
I couldn't have said it better myself.
Jess Carter:
It's neat to work with other human beings who work with AI all the time, do data science all the time. We are allowed to be concerned too about its application and use.
Alexa Myers:
Yeah.
Jess Carter:
Oh, my gosh. Thank you for being here. I really appreciate this lived experience. I think people see data science, to your point, as a black box, and you have taken a huge stride here today helping a bunch of data-driven leaders understand more about it. So thank you.
Alexa Myers:
Thank you.
Jess Carter:
Thank you for listening. I'm your host, Jess Carter. Don't forget to follow Data-Driven Leadership wherever you get your podcasts, and rate and review, letting us know how these data topics are transforming your business. We can't wait for you to join us on the next episode.