42 Min Read

[The AI Show Episode 194]: Agentic AI Timelines, Generalists vs. Specialists, Resume Tips, AI Learning Ownership, & Handling Model Updates

Featured Image

Is your AI pilot failing because the model isn't smart enough, or because your culture is resisting the change? 

In this AI Answers episode, Paul Roetzer and Cathy McPhillips tackle the invisible friction stalling AI adoption. They go beyond the constant hype of new model releases to confront the messy reality of implementation. The conversation dissects some of why middle management is the critical layer for normalizing AI, how to handle employees who generate disproportionate value, and why generalists are suddenly outperforming deep specialists.

Listen or watch below—and see below for show notes and the transcript.

Listen Now

Watch the Video

What Is AI Answers?

Over the last few years, our free Intro to AI and Scaling AI classes have welcomed more than 40,000 professionals, sparking hundreds of real-world, tough, and practical questions from marketers, leaders, and learners alike.

AI Answers is a biweekly bonus series that curates and answers real questions from attendees of our live events. Each episode focuses on the key concerns, challenges, and curiosities facing professionals and teams trying to understand and apply AI in their organizations.

In this episode, we address 15 of the top questions from our January 15th Scaling AI class AND our Marketing Talent AI Impact Report Webinar, covering everything from tooling decisions to team training to long-term strategy. Paul answers each question in real time—unscripted and unfiltered—just like we do live.

Whether you're just getting started or scaling fast, these are answers that can benefit you and your team.

Timestamps

00:00:00 — Intro

00:06:11 — Question #1: Who owns AI learning: L&D or departments?

00:09:52 — Question #2: Hiring dedicated AI change management consultants. 

00:11:54 — Question #3: Middle management’s role in normalizing adoption.

00:14:27 — Question #4: Signals a pilot is failing due to culture, not tech. 

00:16:12 — Question #5: Balancing learning pace vs. rapid experimentation. 

00:20:11 — Question #6: Hiring for critical thinking and AI skills.

00:23:31 — Question #7: Experience vs. Adaptability in talent acquisition. 

00:25:35 — Question #8: Protecting and compensating AI leaders.

00:27:56 — Question #9: Using AI with confidential data restrictions.

00:30:35 — Question #10: Realistic timelines for AI agent advancement. 

00:33:21 — Question #11: Managing model selection and "agent chaos."

00:37:24 — Question #12: The rise of the Generalist vs. Specialist. 

00:41:14 — Question #13: Proving AI skills beyond certificates. 

00:44:25 — Question #14: Trust and authenticity in AI content.

00:48:35 — Question #15: AI SDRs: Vendor questions vs. building in-house. 

Links Mentioned


This episode is brought to you by Google Cloud: 

Google Cloud is the new way to the cloud, providing AI, infrastructure, developer, data, security, and collaboration tools built for today and tomorrow. Google Cloud offers a powerful, fully integrated and optimized AI stack with its own planet-scale infrastructure, custom-built chips, generative AI models and development platform, as well as AI-powered applications, to help organizations transform. Customers in more than 200 countries and territories turn to Google Cloud as their trusted technology partner.

Learn more about Google Cloud here: https://cloud.google.com/  


Read the Transcription

Disclaimer: This transcription was written by AI, thanks to Descript, and has not been edited for content. 

[00:00:00] Paul Roetzer: The fact that you invested the energy to do the thing is actually what makes it worthwhile. It's why. human art differs from ai art. Welcome to AI Answers, a special Q&A series from the Artificial Intelligence Show. I'm Paul Roetzer, founder and CEO of SmarterX and Marketing AI Institute. Every time we host our live virtual events and online classes, we get dozens of great questions from business leaders and practitioners who are navigating this fast moving world of ai.

[00:00:30] But we never have enough time to get to all of them. So we created the AI Answers Series to address more of these questions and share real time insights into the topics and challenges professionals like you are facing. Whether you're just starting your AI journey or already putting it to work in your organization.

[00:00:48] These are the practical insights, use cases, and strategies you need to grow smarter. Let's explore AI together.

[00:00:59] [00:01:00] Welcome to episode 1 94 of the Artificial Intelligence Show. I'm your host, Paul Roetzer, along with my co-host today, Cathy McPhillips, our Chief Marketing Officer at SmarterX. Hello, Cathy. 

[00:01:11] Hello. 

[00:01:11] Paul Roetzer: if you're wondering why Cathy is with us today is because AI Answers is a special edition of the Artificial Intelligence show.

[00:01:19] This is, in addition to our weekly that Mike and I do. Every other week or so, we do an AI answers episode that is presented by Google Cloud. It is a series based on questions from our monthly intro to AI and scaling AI classes. So if you're new to those, we, every month we do a free intro to AI class and a free five steps to scaling AI class.

[00:01:41] And so, at each of those classes, we get dozens of questions that we can't get to during the live session. And so. We, the week following, we record and then air a special edition where we answer some of those questions. So today's is, following scaling ai, the last class we [00:02:00] did, and we are also mixing in some questions from a marketing talent AI impact webinar that we hosted on January 27th.

[00:02:08] So we released a new report as part of our AI industry council. And we had amazing questions from the audience, so we've mixed in some of those questions as well. So again, AI answer episode that is, special. We do it biweekly. This one is featuring specifically questions from scaling AI and, the marketing talent AI impact webinar.

[00:02:28] So again, special thanks to Google Cloud for sponsoring this series as part of our AI literacy project, where we try and make AI education as accessible as possible to as many people as possible. So the Google Cloud marketing team's been an amazing partner of ours for the last year or so. In addition to sponsoring the AI Answers podcast series.

[00:02:47] They're our partner on the intro to AI and scaling AI classes, as well as a collection of blueprints and the marketing AI Industry Council. You can learn more@cloud.google.com. And then a few notes. Cathy, before we get [00:03:00] diving right into these questions, MAICON.ai one, you can go learn more about MAICON.

[00:03:05] It's coming out October 13th to the 15th. This is our seventh annual conference. We're expecting 2,500 plus. I think I'm safe to say, Cathy is our, 

[00:03:13] Cathy McPhillips: you are safe to say, 

[00:03:14] Paul Roetzer: okay, the 2,500 plus is our goal there. but the key here is the call for speakers is open. So if you have a great story to tell an AI transformation story.

[00:03:25] specific knowledge and capabilities that could fit into our strategic AI track, or our applied AI track. We would love to hear from you. There is a link to submit, to speak at MAICON, that is there now. So we are in an active call for speakers for MAICON 2026, and we would love to have you submit, if you've got a story to tell.

[00:03:43] we also have our AI for Agencies Summit coming up on February 12th. This is a free virtual event presented by Screen Dragon. there's already, I think we're well over 1500 already registered for that event. We're expecting probably 3000 for that event. It's from [00:04:00] noon to five Eastern on Thursday, February 12th.

[00:04:02] Again, that is ai for agencies.com. You can go learn more about that. And then one other free thing I'm gonna mention is we have, AI for department webinar series week happening. We're really excited about this. It's a new way we're trying to do this. We are launching new, downloadable free blueprints that are specific to AI for marketing, AI for sales and AI for customer success.

[00:04:26] But the way we're gonna do that is we're gonna host a live event. each day on February 24th, 25th and 26th to give a preview of those blueprints and then to make those blueprints available so you can register for one of those or all of those. And the URL for that, Cathy, is SmarterX AI slash webinars.

[00:04:44] Is that right? 

[00:04:45] Cathy McPhillips: That is correct. 

[00:04:45] Paul Roetzer: Okay. So again, AI for department webinar series, totally free. February 24th, 25th, 26th. And then you have AI for agencies. That is a free registration option on February 12th. So again, all of all part of this AI literacy project, we're trying to do as much as we [00:05:00] can to make as many of the educational.

[00:05:03] programming and events that we offer, as accessible as possible to anyone. So if I missed anything, Cathy, go ahead and hit it. Otherwise, we've got, it looks like 15 questions we're gonna get through in the next. 40 minutes 'cause I have a meeting. 

[00:05:18] Cathy McPhillips: Yep. I mean, it is like AI all the time around here. 

[00:05:20] Paul Roetzer: It really is, and it doesn't, my life doesn't change.

[00:05:23] I still have a million meetings and I still am the CEO of this company. So all these podcasts, we just sort of squeeze it into the hour that's open in my schedule. So, 

[00:05:31] Cathy McPhillips: yep. 

[00:05:31] Paul Roetzer: I have not looked at these questions. We're gonna go just like we do in the live and see I've answers. If I don't, I'll punt it and tell you I don't, I don't know.

[00:05:39] Cathy McPhillips: That's the best thing you get. You're honest about it. 

[00:05:41] Paul Roetzer: Yeah. 

[00:05:41] Cathy McPhillips: Okay, let's jump in. So the way we do this is we take the questions from everything that we're doing. We put them through, AI to help us prioritize what we haven't answered on previous AI answers, things, questions from the class that we thought were really good, that our listeners would like.

[00:05:58] Claire and our team, [00:06:00] is the human that looks through all of these as she sends 'em over to me and I run them through to make sure that they flow. That just a second set of human eyes to make sure that Paul hasn't answered it six, six times. so here we go. 

[00:06:11] Question #1

[00:06:11] Cathy McPhillips: All right. Number one, if training and education consistently block AI progress, who is actually responsible for AI learning today?

[00:06:19] Our l and d team stepping up Or is ownership shifting by company size? 

[00:06:24] Paul Roetzer: This is definitely, I don't think there's a universal answer to this. I think in larger enterprises, l and d is, is certainly, the most likely party responsible for this. I think I might've mentioned this recently and again, I have no idea what webinar or podcast it was on, so I'll just say it again.

[00:06:41] One of the things we see, and again, through our AI Academy by SmarterX, we, we talk to a lot of businesses that are buying licenses for their teams or their departments or their organizations. And what we often find when we're talking to teams, so let's say like A CMO reaches out and wants to get licenses for the whole marketing team, one of the first [00:07:00] questions our, business account team will ask and the discovery call is, who's gonna own this?

[00:07:05] So it's like gr great. It's kind of like if we go buy 50 copilot licenses and give everybody. It's not gonna work. And education is the same way. You show up, you buy 50 licenses to our AI Academy. There's a wealth of stuff that can drive adoption real fast. But if someone doesn't own making sure that it's actually gonna happen, then it's gonna go nowhere.

[00:07:27] Right? So if you don't have an L and D team taking the lead on this, or an HR team or whatever it might be. If you're going to commit to investing in education and training, which you should be doing, whether it's with us or someone else, someone has to own that and not just like. Coordinating it. I'm saying own the goals of that program.

[00:07:47] So if you have goals around number of GPTs built, time saved, new projects launched, I don't, whatever your goals are for the education program, someone has to own that and have responsibility to that. So [00:08:00] we do not see a universal approach. We see a lot of people who don't have owners for this, honestly.

[00:08:05] Cathy McPhillips: And course completion does not equal success. 

[00:08:08] Paul Roetzer: It's a, it's a leading indicator. Like, you know, we look at things like that ourselves. And actually, ironically, right before this was. I was coming on and so I obviously haven't talked to Cathy about it. It's literally just on my whiteboard. I was thinking about like back in HubSpot's early days of HubSpot when we were, my agency that I sold was their first partner back in oh seven.

[00:08:27] They had something called Q and it was a customer happiness index, and they used it to actually monitor utilization of the software as a leading indicator to happiness with the product and then like renewal. I have no idea if they're still using a form of Q today. But I was thinking about like what, what is our Q like model?

[00:08:44] What is our success score within our team that tells us that someone is not only getting value from the academy, but actually is likely an indicator of true transformation happening in their company. And so I think some concept of that, which may be once I build [00:09:00] that next week, I'll, I'll may maybe it'd be helpful to share that with people because I think there are leading indicators that can tell us.

[00:09:07] that you're probably heading toward a successful adoption of an education and training program. 

[00:09:12] Cathy McPhillips: Yeah. And when we hear from people in our community saying that they've been successful with this, it's like, we wanna hear those stories and hear how you did it. Yeah. You know, Noah is hearing stories from customers every day on things that they're doing, and how can we take all of that?

[00:09:25] Then better be able to serve our customers, saying like, okay, thanks for buying courses, but we wanna set you up for success. 

[00:09:29] Paul Roetzer: Yeah. We are actually in the midst of getting ready to launch an AI Transformations podcast series, so another special edition that we'll do where we'll feature these stories. And then as part of AI Academy, we're working on a similar concept where we'll feature community members, mastery members, and their stories of transformation personally and at the company level, because I think we all want to hear those stories and be inspired by them.

[00:09:50] Cathy McPhillips: Yep, absolutely. 

[00:09:52] Question #2

[00:09:52] Cathy McPhillips: Number two, would you recommend bringing in a dedicated change management consultant for an AI initiative? And if so, at what phase do they create the most value? 

[00:10:02] Paul Roetzer: I would definitely consider it. I think it's much like the education and training thing. If you have someone on your team who can fill that role, that might be one of those.

[00:10:10] You know, what are the new roles people are gonna have thing, and maybe that's, that's one of 'em. Maybe it's an operations person that's focused on AI change management. Maybe it is someone on the HR side. I don't know. I'd like to, your organization would sort of dictate where that would come from, but if you don't have those people on your staff, then I would think of it almost like an EEOS consultant.

[00:10:28] Like if you wanna put the EOS system into an organization, you look around and say, do we have an integrator that can do this? And if not, then you might, maybe you hire someone who does EOS consulting. So I, and I know Cathy, you talk with our community members more than I get the chance to, but there are people who are putting themselves in this sort of position to be those change management consultants who can come in and do that exact thing.

[00:10:51] So I think for a lot of organization that's gonna be the faster path, is just bring someone in who can help advise on this. And then when the right time is, it's probably more a [00:11:00] question of when is your organization ready for it, to have the change management consulting versus just running some pilot projects?

[00:11:07] Cathy McPhillips: Or is that their. Consulting role is to figure out, are we ready? What do we need to put in place to be ready? 

[00:11:13] Paul Roetzer: Yeah. All I know is I've talked with a number of consultants who've been wanting to do the change management consulting for a couple years and are qualified to do it, and their clients and prospects weren't ready for that conversation yet.

[00:11:26] It's like they were offering a service that was needed, but the clients didn't really know they needed it yet. They just wanted to know how many ChatGPT license should I get and what GPT should I build? And it's like, oh yeah, yeah, we'll do all that other stuff. Down the road. That change management stuff sounds great, but like, I don't need it right now.

[00:11:41] I just need people to actually use the tools. So it really is a readiness thing for an organization of when being honest with yourself, when are you ready for that to happen? 

[00:11:54] Question #3

[00:11:54] Cathy McPhillips: flows nicely in number three. what role should middle management play in normalizing AI adoption without [00:12:00] creating fear or resistance?

[00:12:01] Paul Roetzer: I think the job of any manager, whether it's middle manager or above, is to, to help people find practical use cases that make an impact on their job every day. So I'm obviously as big a proponent as anyone in like, we should be using AI to drive innovation and growth and do all these amazing new things.

[00:12:23] But the reality in most organizations is they still need someone to get 'em that first 10 yards. Like they need 'em to get to that first success of like, oh, okay, so this is what AGI PT does. Like, now I get it. Can I, okay, can I build a couple for this? Like, and so I think middle management in a lot of ways, or any management's function is going to be helping people.

[00:12:45] Get to that first success and then stack successes. So they are focused more on optimization, efficiency, productivity early on, but they need a lot of handholding. Like there, there's always gonna be that 10 to 20% of your team who are racing [00:13:00] ahead and they're trying all the new stuff. But then there's gonna be that messy middle part where they kind of want to do this, but they don't know how.

[00:13:07] And then there's gonna be. The 10, 20%. I want nothing to do with this. And so I think the job of managers is probably gonna be to focus your energy on that middle part of the people who want to do it, but don't know how to do it. 

[00:13:19] Cathy McPhillips: So things like guidelines. You know, guardrails, enable sample prompts enable enabling them to telling them to go do it.

[00:13:27] A lot of that is just like, they don't know what they don't know and 

[00:13:30] Paul Roetzer: Correct. 

[00:13:31] Cathy McPhillips: They dunno. They can, 

[00:13:32] Paul Roetzer: yeah. Without saying, okay, you know, salesperson, here's five recommended use cases with sample prompts and we actually built you AGI, PT, like. We'll take an hour and let's go through this together. Right. And then customer success team, you know, here's what we've designed for you.

[00:13:47] So I think these playbooks where you actually roll it out and you do personalized examples for people is the most fundamental way to do this successfully. And so few organizations are doing it that way. 

[00:13:59] Cathy McPhillips: Yeah. And just [00:14:00] knowledge sharing. You know, like yesterday Jeremy was doing, he was doing something this week and I said, okay, can you actually.

[00:14:05] Can I, can you screen share with me while you're doing this next? 

[00:14:08] Paul Roetzer: Yeah. 

[00:14:08] Cathy McPhillips: Or can you just record it so I can watch it? Because you can tell me all day long, but I do better visually. So if I would like to learn how you're doing that. 

[00:14:16] Paul Roetzer: Yep. 

[00:14:16] Cathy McPhillips: And that's not AI necessarily, but it's any that does certainly apply to all of the AI tech that we're using.

[00:14:21] Paul Roetzer: Yeah. And Jeremy's a director of marketing on Cathy's team, if you don't know who Jeremy is, 

[00:14:27] Question #4

[00:14:27] Cathy McPhillips: number four. what signals tell you an AI pilot is failing because of culture and not technology. 

[00:14:36] Paul Roetzer: If you thought it through and personalized it to the individual or the team and they're still not doing it. So, you know, let's say you roll out ChatGPT to the marketing team or the sales team or the CS team, and in that rollout process, you do a a one hour kind of workshop demo.

[00:14:55] Here's what it does. Here's and marketing team, sales team sees, here's sample [00:15:00] prompts we've built for you. Here's gpt. You can each use. Like you've done the hard work upfront to make it easy for them to adopt, and then you monitor daily or weekly active usage and you don't see it happening that that tells you it's failing because of culture or a very tight window where, okay, maybe people were just too busy that seven day period, but if you look over a 30 day, 60 day period and you're seeing a lack of utilization even after you've done the personalization of the technology to them.

[00:15:31] Then you, you may have a culture issue. The other way you can do it is, potentially upfront just surveying people before you launch all these pilots and getting at the sentiment in the organization of like, you know, how excited are you about integration of AI into your roles and things like that.

[00:15:48] Assuming people are gonna be honest, you, you could probably very quickly realize like, wow, okay, there are 40% of people who are. Neutral at best to the use of AI in their jobs. [00:16:00] That's gonna tell you right away, you've got some other barriers to deal with, and it's not gonna be as simple as giving 'em some education and training and some tech, and assuming they're gonna jump at it and start adopting it.

[00:16:10] Cathy McPhillips: Sure. Okay. 

[00:16:12] Question #5

[00:16:12] Cathy McPhillips: Number five. As roles are reinvented by AI learning becomes a moving target, and time for learning is finite. How should leaders think about the size, pace, type, and volume of learning versus the need for rapid experimentation and deployment? 

[00:16:28] Paul Roetzer: I think personalization, again, comes back to the answer.

[00:16:30] So we are actually, we were in a conversation about this a couple days ago, internally with Jess, our head of learning and some of the other people on the team, as we're trying to solve for AI for executives. So this is something I've been working on for a while and trying to envision like, what does that look like within the platform, within our AI Academy.

[00:16:47] because we get people reaching out to us saying, Hey, we want to educate our CEO, but he or she isn't gonna take six hours and do something like, I think I can get 60 minutes with them next month. Like, what should I prioritize for [00:17:00] them? So we intentionally design our academy to enable these like, personal learning journeys.

[00:17:06] And so what I was thinking of on the AI executive, side is. I think it has to be like 45 minutes, like personalized to the individual role. Like say, CEOs or CEOs. And by the way, I probably shouldn't even be divulging all this. Like I 'm just kinda like telling you the roadmap, but if I'm the CEO and I want to know this, I know I could give you an hour of my time, like that's about it.

[00:17:29] and whether it's like audio format or I'll actually go in and watch the course or someone's gonna like, come in and show the course with me. We're gonna talk whatever it is. I know I've got that. And then you want this like custom learning. So the CEO doesn't care about a certificate like. That's the C-suite level.

[00:17:45] They're not there for the professional certificates to put it on LinkedIn and to like show, build their resume. They just want the fundamental knowledge so they can make better decisions in their organization and guide their team. So when I think about that, it's like, okay, we have to design our curricul 

[00:17:59] [00:18:00] to accommodate that persona or that role versus someone who's like, I wanna become a change agent within my organization. I want all the knowledge, I want every certificate you all can create. I want to know different industries. And so you have to think differently about what is the goal of the individual.

[00:18:17] And then how do you adapt the curriculum to them over time? And so that's why I say personalization is really the only way to address this. And some people want our Gen AI apps, we drop a new one every week. And I'm sure there are people in our Community mastery members who are watching every one of those.

[00:18:34] Cathy McPhillips: Mm-hmm. 

[00:18:34] Paul Roetzer: And then I'm sure there's other people who are like. What we're trying to consider is, okay, let's like make sure there's a long tail of these because maybe somebody just wants to know how to use AI within Google Sheets or Excel and like that's the one thing they're gonna watch 'cause it's like super relevant to them and they're not so interested in the latest video generation technology.

[00:18:56] And so for us, we see that as our job is to create [00:19:00] this very diverse, collection of. Very relevant and near real time education so that people can adapt learning journeys. But I think individually we have to like consider where people are, what their goals for learning are, and then be realistic about how it's gonna fit into their schedules.

[00:19:18] Cathy McPhillips: Sure. And then even for like the Gen AI app series, just to frame it around our education, but there are other examples. Obviously we were, when we first started, it was like, let's do gen ai, different technologies every week. And now it's turned into different use cases because Correct. The same tool across your organization can be used so many different ways.

[00:19:36] So focusing again on what problem is that person, that department trying to solve for. 

[00:19:41] Paul Roetzer: Yeah. And the way we've kind of guided the gen AI apps. Now it's like we're very focused on features within apps and platforms versus the whole platform because again, the Gen AI app reviews for us every Friday is supposed to be 15 to 20 minutes.

[00:19:53] You, you can't review ChatGPT in 20 minutes. So, but like you can do the agent mode in ChatGPT in 20 [00:20:00] minutes. And so that's what we think about the Gen AI app as like features within apps and platforms. Yeah, so that's like quick hitting and meets like that very specific need. 

[00:20:10] Cathy McPhillips: Okay. 

[00:20:11] Question #6

[00:20:11] Cathy McPhillips: Number six. This is a really good question.

[00:20:13] Paul Roetzer: They've all been really good so far. 

[00:20:14] Cathy McPhillips: Yeah. We often hear that education and awareness are the biggest AI roadblocks, but our struggle is more about whether people have the right skills to use AI as thinking as a thinking partner and automation tool. What specific skills should companies prioritize when hiring and how do you actually assess things like critical thinking?

[00:20:32] Paul Roetzer: This is a really good one. I feel like there's like multiple questions in here, so I'm trying to like. Harsh this in my brain answers. So I think the education, the awareness is. Often to me, probably the leading indicator, the lack of education, I think many times comes from the lack of awareness at the executive level.

[00:20:49] So if there's a lack of awareness about AI capabilities, then you don't commit to building the right education. You don't commit to internal academies, things like that. So those are very significant roadblocks. And then the [00:21:00] skills, to, to know, to use it as a thinking partner automation tool, things like that.

[00:21:07] That comes from proper education, reskilling, upskilling, where you're showing them practical ways to do this. So I think there is this mix of just this foundational knowledge needed of what is a reasoning model. So if I'm gonna use it as a thinking partner, I almost need to understand what the reasoning capabilities are of these models, because if I don't know that they can go through this chain of thought and they can build a plan to do something.

[00:21:33] I'm gonna be probably less reliant on it as a thinking partner. So, I don't know. I mean, in terms of skills to actually look for and prioritize in hiring, I think it's probably still, you know, if we're looking at critical thing and stuff, you, you're just doing problem solving, but where they don't have access to the ai, like mm-hmm.

[00:21:54] I want you, I wanna, here, here's a problem we're trying to solve. How would you do this? [00:22:00] Like build your action plan to solve this? Like let's talk about that. Okay. Now if I give you chat, GPT, how would you do this? 

[00:22:08] Cathy McPhillips: Right? 

[00:22:08] Paul Roetzer: And like, let's do a real live demonstration. Because what I would want to see is what are their follow up prompts and questions to the ai?

[00:22:16] So an example I've given recently is I'll often when I'm doing it as a thought partner, I'll say, let's go through this step by step. Because the AI wants to just solve your problem right away. And they're like, oh yeah, here you go. Here's a thousand words on the question you asked me. It's like, no, no, no.

[00:22:30] I wanna do this together and I want to understand each step you're taking. And I might take us a different direction. If I interviewed somebody who explained what I just explained, I'd be like, you're hired. Like let's go. Like that's awesome. So I don't think enough people know that though. But I also think we can't be.

[00:22:48] Overly judgmental of people that don't know because the company they're at might not have provided them the training to know that. So then you do have to actually just zoom out and say, okay, let's think about the human capabilities here. And do they [00:23:00] transfer to working with AI once they're given the proper tools and training?

[00:23:04] Cathy McPhillips: Right? 

[00:23:04] Paul Roetzer: And that is where you just have to get at are, are they intrinsically motivated? Are they, you know, do they think well on their toes? are they good communicators? Are they strong writers? Like all that stuff still matters, 

[00:23:15] Cathy McPhillips: right? And are there a hiring managers who know how to even assess for that right now?

[00:23:20] Paul Roetzer: That's tough. 

[00:23:22] Cathy McPhillips: Yeah, 

[00:23:22] Paul Roetzer: there aren't many. 

[00:23:25] Cathy McPhillips: Okay, so Tuesday we had the marketing talent AI impact webinar and reports. 

[00:23:31] Question #7

[00:23:31] Cathy McPhillips: So this is coming from that, webinar specifically, do you see AI driven talent acquisition trends extending beyond marketing? Does age or years of experience matter as much as adaptability?

[00:23:43] Paul Roetzer: These are really good, tough questions. so I've been pretty vocal lately that I think the people who are gonna experience the most disruption are gonna be entry level and middle management. And what I mean by [00:24:00] disruption is I think they're gonna be the ones that are gonna have to, very aggressively pursue AI literacy and demonstrate their capabilities with ai because.

[00:24:12] Right now, the most valuable user of a chatbot is someone with years of experience and domain expertise who knows how to talk to it, ask the right questions, continually prompt it, and then assess the output it gives them. And that's what entry level people, and often middle management, lack: that deep domain knowledge that makes them really good at working in collaboration with the AI.

[00:24:36] Cathy McPhillips: Mm-hmm. 

[00:24:38] Paul Roetzer: So I think that the years of experience right now favor people who can also work with the AI. And that's why I say, when you're entry level or middle management, you have to go above and beyond to prove your ability to work with these tools at the highest levels, [00:25:00] so that it's never a question. I've said before, like, I struggle right now to actually figure out what the entry level roles are at SmarterX.

[00:25:08] And I would gladly hire as many entry level people as I could. I just don't know what to have them do right now, because most of what we would historically hire entry level people to do, the AI can do. And I don't like saying that. I'm not saying it because I don't want to hire as many humans as possible.

[00:25:24] I'll keep hiring as much as we can, but I'm also not gonna hire someone who's gonna come in and be obsolete in six months.

[00:25:33] Cathy McPhillips: Right. 

[00:25:35] Question #8

[00:25:35] Cathy McPhillips: Okay. Number eight, should companies be doing more to protect their AI leaders and superstars, possibly even through contracts as AI capabilities continue to advance?

[00:25:46] Paul Roetzer: Yeah. I'd have to think about what could be included within that, but I would say there's a lot of complexity here. I don't remember if it came out in the Marketing Talent AI Impact Report; I know it [00:26:00] was at least discussed in our group settings, so I'll throw it out there. One of the council members brought up this idea: let's say one employee, and I'll just pick some numbers here.

[00:26:14] Let's say you've got five people at the director level who are all making $150,000-plus a year. One of them is, nights and weekends, building GPTs, building prompt libraries, sharing those with the team. One of those GPTs leads to a new line of business that generates a million dollars in revenue. Does that person deserve to get paid the same $150,000 as the other directors?

[00:26:40] Hell no. And they know that. They know the value they're creating. And so I think that's the challenge: when you have these people who take these leaps and start really driving innovation and growth through their work with AI, they're probably happy as hell that they're getting the opportunity to do this.

[00:26:59] They're probably [00:27:00] feeling more fulfilled in their job, but they're also gonna look around and be like, wow, I'm creating a disproportionate amount of value in this company right now. So that came up in our council meetings: how do you compensate that person? And I think that council member in particular

[00:27:14] was treating it more as one-time bonuses. So it's like, we can't just change our pay scale, you're still a director, but if you create something that creates disproportionate value, you're gonna get a one-time bonus of $50,000, things like that.

[00:27:40] So I do think we need to really start contemplating that when people show true ability to make a difference through their own investment of time and energy to master AI use in the company, they have to be compensated for that. However you do that: stock options, bonus plan, whatever it is. And again, I don't know very many companies that have solved for that yet. They're just starting to ask that question.

[00:27:54] Cathy McPhillips: Right. 

[00:27:56] Question #9

[00:27:56] Cathy McPhillips: Okay. Number nine. I manage a small team of media buyers and we're not allowed to use AI tools with confidential data.

[00:28:02] How can teams like mine still get real AI experience? 

[00:28:07] Paul Roetzer: I would try and find ways to anonymize the data so it's not confidential, so you don't run into that issue. That would be my first reaction. Is there a way to get the proprietary information out of it when you're just trying to inform the media buy?

[00:28:24] The second option would be, if that is core to your business, like if 50% of your revenue comes from media buying, then I think it would be a hundred percent worthwhile to talk to someone internally, you know, in IT, or to talk to an outside advisor or consultant who could build you a proprietary tool on open source technology, or something that lives within your walls and isn't going to these model companies,

[00:28:50] to where the team is confident that you can now use the technology without concern of leakage of proprietary information. So yeah, if it's core to [00:29:00] your business, I wouldn't stop at "no, you're not allowed to." It's like, okay, well, what's plan B? How can we make this safe to do, so that we're able to do it?

[00:29:09] And one way to prove that, if you need to make the business case, is to use dummy data. Go into ChatGPT or Gemini and say, I wanna prove a business case for building a media buying tool. My team is concerned about confidential information. Here is a template of what a database looks like, what a spreadsheet looks like, the kind of information I would give you.

[00:29:29] Can you fill this out with dummy data that we can use to build a sample project, so I can go to my executive team and say, look how much time I could have saved, look at the insights we could have had, and this is all dummy data, it's safe. So I think you gotta think about how to make that business case. I wouldn't give up, though. If it's core to your business, it's worth the time for sure.
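For readers who want to try the dummy-data approach Paul describes, here's a minimal Python sketch that fabricates a media-plan spreadsheet with no confidential information in it. The column names (`line_item`, `channel`, `cpm_usd`, `spend_usd`) and the value ranges are assumptions for illustration, not anything from the episode; swap in the shape of your own template before handing it to a chatbot.

```python
import csv
import io
import random

# Channels are illustrative placeholders, not a real client's plan.
CHANNELS = ["Search", "Social", "Display", "Video", "Audio"]

def make_dummy_media_plan(rows: int, seed: int = 42) -> list[dict]:
    """Generate fabricated media-buy rows that mirror the shape of a real
    spreadsheet, so AI workflows can be piloted without exposing client data."""
    rng = random.Random(seed)  # seeded so the sample is reproducible
    plan = []
    for i in range(rows):
        impressions = rng.randint(50_000, 2_000_000)
        cpm = round(rng.uniform(2.0, 25.0), 2)  # cost per thousand impressions
        plan.append({
            "line_item": f"LI-{i + 1:03d}",  # placeholder ID, not a real client
            "channel": rng.choice(CHANNELS),
            "impressions": impressions,
            "cpm_usd": cpm,
            "spend_usd": round(impressions / 1000 * cpm, 2),
        })
    return plan

def to_csv(plan: list[dict]) -> str:
    """Serialize the dummy plan to CSV text, ready to paste into a chatbot."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=plan[0].keys())
    writer.writeheader()
    writer.writerows(plan)
    return buf.getvalue()

if __name__ == "__main__":
    print(to_csv(make_dummy_media_plan(10)))
```

Because the generator is seeded, everyone on the team sees the same sample, which makes the before-and-after business case easier to demo to an executive team.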

[00:29:51] Cathy McPhillips: And I was in planning and buying like six careers ago, and I'm thinking about competitive research, sentiment [00:30:00] analysis, just deep research, learning more about your customer. I mean, there are so many things you could be doing that don't involve confidential data at all.

[00:30:06] Paul Roetzer: Yeah, and for somebody who's maybe not in this industry, media buying literally just means, okay, we've got a million dollars to spend to promote our product, our brand, where should we spend it?

[00:30:17] Online channels, social media, Google ads. That's what a media buyer does. They figure out how to allocate that spend for the best cost per thousand, the best impact.

[00:30:27] Cathy McPhillips: That's what I used to say when I was like 24. I was like, I spend other people's money for them.

[00:30:33] Paul Roetzer: It's true.

[00:30:35] Cathy McPhillips: Okay. 

[00:30:35] Question #10

[00:30:35] Cathy McPhillips: Number 10.

[00:30:36] If we zoom out, ChatGPT moved from 3.5 to 5.2 in about three years. If AI agents are at a ChatGPT 1 stage today, how fast do you think this could advance? Could we realistically see an agents' GPT-4 moment within a year?

[00:30:50] Paul Roetzer: Yes. I wouldn't say they're at GPT-1. I would say they're probably at GPT-3, maybe, at this [00:31:00] point. This is gonna be uneven.

[00:31:03] The way I've explained this is: agents are being built to be more autonomous and more reliable by industry and role. So it's not gonna be like we flip a switch and GPT-6 is all of a sudden a universal agent that works the same for every profession. Right now it's AI researcher: they're all trying to build AI agents that do the job of an AI researcher.

[00:31:24] 'cause it is the most compounding value thing they can create. Once they've automated AI research, now they can massively scale up their own labs. But what also then happens is you pick verticals and venture capital firms fund the building of agents in specific verticals. A popular one right now would be the legal industry.

[00:31:44] Harvey is the major player there, worth billions of dollars. They're basically taking the core model and doing their own fine-tuning and training to make it specialized in the legal industry, to do the job of an attorney or an associate within a law firm. So the [00:32:00] same thing could happen in any industry.

[00:32:02] People can be doing it for the marketing industry, sales, operations. You can go into healthcare, all these things. So I think it'll take time before we look around and say, wow, AI agents, regardless of industry, are just better than humans, and I'm gonna start hiring agents instead of people.

[00:32:20] I don't think we flip a switch in the middle of this year and these universal, general agents just exist. The other thing I've said recently is, even if they did, let's just say somehow someone reaches AGI later this year, Anthropic, Google, whomever, and they have built a universal agent that can do everybody's work better than the average human.

[00:32:42] If you look at how long adoption curves are taking for just basic chat experiences, just text in, text out, which we've had for over three years now, most organizations are still at the piloting phase at best. I think if we had a universal [00:33:00] AGI agent this year, it could be 2028, 2029 before most organizations even figure out

[00:33:06] what the hell to do with it. So it's gonna be uneven in terms of when it's available to different industries, and it's gonna be extremely uneven in terms of when it's actually adopted and changes industries and companies.

[00:33:20] Cathy McPhillips: Okay. 

[00:33:21] Question #11

[00:33:21] Cathy McPhillips: Number 11, how are organizations managing agent quality and decisions around LLM selection amid constant change? I keep thinking about McKinsey rolling out tens of thousands of agents, and then OpenAI changing default models.

[00:33:35] How do teams avoid chaos? 

[00:33:37] Paul Roetzer: I don't have a good answer for this, nor do I think the IT departments at these major companies have a good answer for this. This is one of those things where, back in 2022, I sort of looked out ahead and tried to project what was gonna happen, and I had companies calling me and saying, hey, should we spend the $3 million and build a proprietary, you know, internal LLM, or [00:34:00] customize an LLM based on GPT-3 or whatever.

[00:34:03] And it was so hard to provide guidance then; it's hard to provide guidance now. One of the things you have to be conscious of is that whatever is available today, you may stop and spend tens of thousands, hundreds of thousands, millions of dollars building a custom, proprietary version of internally.

[00:34:23] It's outdated by the time it's built. And I can't tell you how many meetings I've been in where people are demoing for me the proprietary things that they built. And I'm like, oh, does it do this? Does it do this? Does it do this? They're like, no, no, no, we don't have those capabilities. It's based on a 2.5 model.

[00:34:39] I'm like, what? As good as that is, all your employees are just gonna go and use ChatGPT anyway. You've neutered the thing; there's nothing left in it to do all the things they know are possible. So I empathize with anyone who has to make these decisions. It's super hard. And then you deal with the issue that, I don't know if we have a question about this, but [00:35:00] I brought up the issue of credit-based pricing on this week's podcast, and I can tell you I heard from some people for whom it's a worse problem than I thought.

[00:35:09] I was looking at it as SmarterX, like, we're a smaller company and, oh, this is crazy, you have all these unexpected costs, and we had to shut off our AI tool for five days. I've heard from people who have it way worse than us at big companies. And so you start dealing with that. Even if you don't build your own thing, what if you're using the API, or you're using a company that's doing credit-based pricing on access to their agents?

[00:35:32] It's such a dynamic space right now. I don't think anyone has a great answer for this. And like I said, I feel for the IT teams and the department leads who are trying to maneuver through all of this.

[00:35:48] Cathy McPhillips: But you can't wait.

[00:35:50] Paul Roetzer: You can't. And I mean, we advised one company that had like a massive play with Microsoft as their main provider.

[00:35:59] And so the IT [00:36:00] team is working on this massive build, millions of dollars, and by the time it was gonna be built, it wasn't gonna do half of what we knew they could get from a ChatGPT Team license. It's like, let's just go get you 20 licenses for ChatGPT Team. You could be up and running tomorrow.

[00:36:15] We can build GPTs for everybody on your team by next month. And so that's what we did. The IT team's doing their thing, taking forever, going through all this risk management, which they need to do. Microsoft's building all this custom stuff, spending millions of dollars, and it doesn't do anything.

[00:36:33] And here our team is, a month later, running circles around everybody in the company because they just went and did it. It's hard to advise against that approach. It's why I get a lot of pushback from some of my friends who are big on, you gotta build a proprietary thing, you can't risk data leakage into these model companies.

[00:36:51] Like, I get it. But the opportunity cost of waiting, and then building something that's obsolete six months after you [00:37:00] spent the $3 million? You better make a really, really strong business case to me to not go with the more dynamic route, because I can spin it up for 20 bucks a month per user when I know for a fact I can generate a 10 to 20x return on that 20 bucks a month, at minimum.

[00:37:17] I'm sorry. As a CEO, you're probably not winning that argument.

[00:37:22] Cathy McPhillips: Yeah. Okay. 

[00:37:24] Question #12

[00:37:33] Cathy McPhillips: Number 12, generalists who can connect the dots are highly valued right now. How long do you think that advantage lasts and when does the pendulum swing back toward deep specialization? 

[00:37:34] Paul Roetzer: I think I answered a variation of this question on the marketing talent one.

[00:37:40] And I think the guidance I roughly gave is, I'm a big fan of generalists right now, for sure. I think I used the analogy that if my kids were heading into college, I would strongly encourage a liberal arts college, because I think diversity of knowledge and experience is going to be very, very important.

[00:38:00] We talked this week on the podcast about the AI constitution that Anthropic has built. The woman who largely built that, Amanda Askell, is a philosopher. She created one of the most important documents in AI today, and in retrospect, it could end up being one of the most important documents in AI ever.

[00:38:24] And she's a philosopher. She's not an AI researcher, she's not a specialist in that area. She focuses on ethics and philosophy, and that becomes super important. And so I just feel like we don't know what the future looks like. We don't know what remains uniquely human. And so it's really hard to specialize in one area of study, or one specialized career path, that the AI might be better at in 12 months.

[00:38:52] So, I don't know. I mean, if you go back three years, I would be like, computer science, just become a programmer, become an AI [00:39:00] researcher. And now Anthropic doesn't seem too hot on needing any of those roles in three years, beyond the people they've already hired. I don't know that they're gonna be hiring those people.

[00:39:12] So I think generalist is more stable to predict. It could swing at any moment, I guess, but I think it's gonna have to be very specific. On the specialist side, I'd have to think deeply about what specialist roles would continue to carry enormous value when the AI is, we have to assume, capable of being at or above expert human level at everything.

[00:39:40] Cathy McPhillips: Yeah. When I was listening to the podcast and you were talking about Amanda, it made me think of, this was 2022, I think, MAICON. One of our speakers worked for an AI tech company and she was on a panel with Mike, and she's a linguist. She's a trained linguist and was working for this AI company.

[00:39:58] I'm like, that's really valuable. So, I [00:40:00] mean, there are certainly jobs and niches and things like that that really can work well for you. But again, that's a liberal arts degree, right?

[00:40:09] Paul Roetzer: Yeah, and again, I think you just have to be able to make connections between seemingly unconnected things. It's something I always look for in the hiring process: who makes the connections?

[00:40:20] Like, they read a book about philosophy, or they read a book about, I don't know, psychology, and they connect the dots: oh, our customer journey should actually factor this in; we were thinking about the customer journey and the needs of the individual. And so I've always been a huge advocate of hiring people out of liberal arts schools, because I wanted that diversity of knowledge, and you just never knew when something you learned in economics class was all of a sudden gonna become super relevant to what we were doing in marketing.

[00:40:48] So yeah, I'm a big fan of generalists.

[00:40:52] Cathy McPhillips: I should have done better in economics back then.

[00:40:54] Paul Roetzer: Oddly, I hated math. But the two classes I excelled at in [00:41:00] college were statistics and economics. And I avoided math like the plague in college.

[00:41:06] Cathy McPhillips: I love math. I love stats.

[00:41:08] Paul Roetzer: The best.

[00:41:08] Cathy McPhillips: Economics, I would love to take it now to see, because at that point I wasn't interested.

[00:41:14] Yeah. Okay. 

[00:41:14] Question #13

[00:41:14] Cathy McPhillips: Number 13, beyond listing courses, how can professionals show they're AI-forward on resumes or LinkedIn, especially when it's hard to demonstrate things like creativity, empathy, and judgment?

[00:41:26] Paul Roetzer: Well, build something. Like, if someone came in for an interview and they said, hey, I've built these five GPTs for my personal life.

[00:41:33] Like you and I, Cathy, were just talking this morning about a trip planner. Both of us have planned trips in the last week, and we both used Gemini to do it. So it's like, oh, I built a GPT for that. I built one for my diet and exercise. I've got a GPT for helping my kids in school. It's like, okay, cool.

[00:41:47] You're problem solving. You're using AI in an interesting way. And then, here's one I built to help me with my job. So I think that, and then the other thing that's becoming super accessible, is building [00:42:00] apps. I've talked recently about Lovable and how I've been kind of experimenting with that to build some things.

[00:42:04] There are no barriers to building anything. And so that to me is, to be able to demonstrate in an interview something you built, and be able to talk about why you built it, what problem you solved with it, how it helped you become more efficient, make better decisions, drive innovation in your personal life, your business life.

[00:42:22] That is the gold standard to me right now. I love to see that you took courses and read books and have certificates, and that's all important. But the fact that you built something, especially if you didn't have to, if it wasn't required of you to build it, and you did it anyway? It's good stuff.

[00:42:37] Cathy McPhillips: But I think with the empathy and judgment, you know, all those bullets that were on your resume or CV before, you know, managing a team, doing X, Y, and Z, those certainly matter.

[00:42:48] Paul Roetzer: Yeah. And empathy, I don't know. I've honestly never thought about how you demonstrate empathy, how you interview for that. I'm sure there's thousands of people way more qualified to answer a [00:43:00] question about how you interview for empathy than I am. I'm just thinking out loud. But back in my agency days, we used to have this car ride test.

[00:43:10] We did a lot of car trips. We'd have to go visit clients like six hours away, and you'd spend six hours each way with people in the car. And so we literally used to have this road trip test: would you want to take the road trip with them? And part of that was just, is it a fit culturally?

[00:43:27] Are they a good person? Are they a caring person that you want to be around and talk to? I'm not asking specific questions to get at empathy. I'm just trying to figure out, is this a good person? And that became one of the big filters that me and Tracy would have: do we wanna make this hire?

[00:43:42] It's like, yeah, I'd love to go on a road trip with them. They're fascinating. They're good people and they have interesting things to talk about. But I think empathy,

[00:43:49] Cathy McPhillips: That was my second day at work. We went to Athens.

[00:43:51] Paul Roetzer: that's true. 

[00:43:52] Cathy McPhillips: Was that all planned? 

[00:43:53] Paul Roetzer: We did, yeah. You passed the test. Yeah, we talked about AI Academy.

[00:43:57] That whole ride on the way to Athens, we did, didn't we? Six [00:44:00] hours there and back. I don't know, maybe that's it: what are they involved in? Are they on boards of nonprofits? Do they get involved in volunteer activities? There's ways to get a sense of, is this an empathetic person who cares about others, not just themselves? But I also feel like so often you can just tell. If you're a good judge of character, you know within two minutes, is this a good person?

[00:44:23] Cathy McPhillips: Right. Okay. 

[00:44:25] Question #14

[00:44:25] Cathy McPhillips: Number 14, how can brands retain trust and authenticity when using AI-generated video, audio, images, or copy?

[00:44:33] Paul Roetzer: I think you just have to be transparent about everything you're doing. The authenticity thing is critical. I've talked a lot about that on recent episodes; I think it came up in the marketing talent

[00:44:43] one. And by the way, we keep mentioning this Marketing Talent AI Impact Study. You can download it; we'll put the link in the show notes. It is available now. It's a great read, one of my favorite pieces we've ever done. Yeah, I think if there's an expectation of authenticity, you have to show up as a human.

[00:44:59] I [00:45:00] don't know how else to say this. And I use this example for our own stuff with AI Academy. The question will obviously come up at some point, or already has: hey, would we ever use AI avatars to create our stuff? And my answer was, absolutely not. They're coming to our academy for us, and it needs to be us.

[00:45:20] Yes, we could save two hours if I created an AI avatar in HeyGen or Descript and had it read a script that ChatGPT wrote. But if you found that out as a customer, you would feel like it was just cheapened. Or if you found out that this was my AI avatar sitting here talking to you, saying all this stuff unscripted, it would just lose something completely.

[00:45:43] So I think the litmus test is: is there an expectation of authenticity, that you wrote the thing, that you're presenting the thing? Then you gotta show up in an authentic way, even if it means you're spending more time and the AI could save you time. Sometimes the time is the [00:46:00] point, right? The fact that you invested the energy to do the thing is actually what makes it worthwhile.

[00:46:06] It's why human art differs from AI art. I'm not saying AI art isn't creative. If you see AI art and it inspires you in some way, and you find it to be creative, buy it. But if you want to know that a thing someone created, and I say this from a personal place 'cause my wife and my daughter are artists, that they spent three months doing the thing, and that this thing that happened in their life is what led them to make the thing,

[00:46:35] I'm buying that over the AI thing 10 times outta 10. Because it's authentic, and I wanted it to be that way. I wanted a human condition behind it. It doesn't mean I won't see AI stuff that I think is awesome and that I share.

[00:46:48] Cathy McPhillips: Right. 

[00:46:48] Paul Roetzer: So I think that's the biggest litmus test is if authenticity is expected, then you gotta show up as the human.

[00:46:56] Cathy McPhillips: So when Macy, who runs our community and social, when she [00:47:00] started, we were talking about AI usage and where it makes sense, where she was permitted to use it based on our guidelines internally and everything. And it's like, there are some places I expect her to be using AI.

[00:47:10] Paul Roetzer: Yeah. 

[00:47:13] Cathy McPhillips: You know, taking a blog post and creating social. She doesn't need to write every single word of those things.

[00:47:18] What she can't do is have an AI bot running in our Slack community, or things like that. All the time she's saving doing this is helping her be more human and authentic in other places that are so much more important.

[00:47:30] Paul Roetzer: Yeah. Another one of my favorite examples is Claire on our team. She created a minute-and-a-half video for MAICON 2025, and it was to build on this Move 37 moment that I told the story of in my opening keynote.

[00:47:42] And it was all AI, but it was also all Claire. She's the only person on our team who could have done it, because she's a storyteller by trade. She has a background in video and graphics. It was amazing and it was very powerful, but you also knew it was AI. We did not hide that at all. We actually featured [00:48:00] the fact that it was AI, and yet there's a human storyteller behind it. I thought it was this perfect blend of AI and human, where it was still authentic to me because I knew what she put into it to make it come to life.

[00:48:13] And it wasn't just a prompt and, here's a minute-and-a-half video. No, it was hours of her time, and her whole life of being able to go in and do the thing that you and I couldn't have teamed up and done, Cathy.

Cathy McPhillips: Right.

[00:48:24] Paul Roetzer: Right.

[00:48:25] Cathy McPhillips: And she had those artist moments of like throwing her hands up, walking away.

[00:48:29] Yeah. Like this isn't, this isn't right. I gotta go back and fix it. So it was a whole process for her. 

[00:48:33] Paul Roetzer: Yeah. 

[00:48:34] Cathy McPhillips: Okay. Our last question, 

[00:48:35] Question #15

[00:48:35] Cathy McPhillips: number 15, if a company is considering a third party AI powered SDR solution, what questions should they ask vendors and what would it take to build this capability in-house instead?

[00:48:46] Paul Roetzer: So this is actually a great one, and I'm not taking the easy way out: I would put that question into ChatGPT or Google Gemini or Anthropic Claude and just ask it. I mean, this is what I do when I have to handle things like this. I gotta talk to my attorneys, I gotta talk to my accountant, I gotta talk to [00:49:00] outside vendors.

[00:49:01] I'll go and be like, hey, what questions should I ask this person? So one, I'll say it's a great one to throw in there as a prompt and see what it recommends to you. But I think for me, the big key is, you're gonna have to define a workflow. You're gonna have to have trust in the solution. You're gonna have to know what guardrails are in place to make sure this thing doesn't go off the rails and destroy the brand equity and brand trust that you have with people in your industry, or your prospective buyers, things like that.

[00:49:30] So I would say you have to have the same level of confidence in this solution as though you were hiring a human to do the same thing. What would you need to know about that person? What processes would you have to put in place? What guardrails would you want? What level of oversight would you want?

[00:49:46] Like, when is it okay for it to make a decision and do a thing, versus needing human-in-the-loop approval to do a thing? So that's probably the simplest way I would look at it: create this process as though you're hiring an [00:50:00] SDR, and then translate that over to hiring an AI instead.

[00:50:03] And what are the other limitations it might have, or guardrails it might need? In terms of how to build the capability in-house, I don't know. I'm actively trying to build this capability for us, because we do not have SDRs. My instinct going into it, my hypothesis, is that I think I can automate 90% plus of what an SDR would do for us, with heavy human in the loop for the last 10%, kind of that last mile.

[00:50:31] I think most of the work, the admin-level work of the SDR, can likely be done by AI, and I don't think it'd be overly complex for us to build it ourselves. That being said, I don't know. It's all more of a hypothesis at this point.

[00:50:48] Cathy McPhillips: Sure. 

[00:50:49] Paul Roetzer: It probably depends on what existing tech you have access to, but there are great solutions out there for this stuff.

[00:50:54] I would obviously talk to your CRM company as a starting point, and then certainly there's third-party technology you could [00:51:00] layer in pretty quickly while you're building your own capability, if you wanted to.

[00:51:05] Cathy McPhillips: Amazing. All right, that's our 15 questions. 

[00:51:07] Paul Roetzer: Three minutes to spare for my meeting. 

[00:51:09] Cathy McPhillips: Well, we will throw a ton of links in the show notes: everything from the talent report, the blueprint series, webinar links, AI for Agencies, MAICON. If you are interested in speaking, or interested in attending, we would love to have you at all of the things.

[00:51:27] Awesome. And Mike and Paul will be back next Tuesday for the regular scheduled show. 

[00:51:31] Paul Roetzer: Yeah. Thanks for the great questions, everyone, and thank you, Cathy and Claire, for coordinating everything. We will talk to you next week. Thanks for listening to AI Answers. To keep learning, visit SmarterX.ai, where you'll find on-demand courses, upcoming classes, and practical resources to guide your AI journey.

[00:51:51] And if you've got a question for a future episode, we'd love to hear it. That's it for now. Continue exploring and keep asking great questions [00:52:00] about ai. 
