Automate Your Agency

Claude Called It Cowork for This Reason…

Alane Boyd & Micah Johnson Season 1 Episode 96




When you think about AI workflows, you typically picture success as a system with no humans involved. But Alane Boyd and Micah Johnson call this "over automation," and it's costing businesses time, money, and sanity.

The hosts have seen it repeatedly: companies build elaborate "hands-off" AI systems only to spend more time fixing what AI got wrong than they would have spent doing it manually. The problem isn't the technology, it's the approach.

In this episode you will learn:

  • Why 80% is the automation sweet spot and 100% creates more problems
  • The difference between automation and AI automation, and why it matters for your workflows
  • How to design systems with human judgment built in without losing efficiency
  • Why micro-automations beat monolithic systems for maintenance and reliability
  • Real examples from Goldman Sachs showing how even Wall Street keeps humans in the loop
  • The two camps getting AI wrong and how to find the middle ground that actually works

If you are building AI workflows or considering it, this episode will save you from expensive mistakes and show you how to get the balance right. 

The real question is not "how much can I automate?" It is "where do I still need to be the smart one?"

Tools and frameworks mentioned: 

  • Claude
  • n8n
  • ClickUp
  • Zapier
  • Make
  • Pipedrive

To receive our AI News Brief, sign up at the Biggest Goal community.

Upcoming Webinars & Workshops:

Learn More about our AI Leadership Workshop

Winning businesses aren’t just working harder; they understand how to use AI strategically. Sign up for the AI Leadership Workshop, a 3-hour expert-led mastermind designed to give leadership teams clarity, alignment, and a practical path to stay competitive.

Book a call to customize your workshop today!

This episode is brought to you by Biggest Goal.

Every quarter your team spends evaluating AI is a quarter your competitors spend shipping. Most leaders feel the pressure but get stuck between ignoring AI and getting it wrong. More tools and more demos won't fix it. What actually works is hands-on training for the people doing the work.

That's what we build at Biggest Goal. Our workshops align your leadership team on a shared AI roadmap. Our 4-week cohorts get your teams building real AI agents and automation workflows. And our ongoing groups keep leaders and builders sharp long after the cohort wraps.


Disclosure: Some links are affiliate links, meaning we may earn a commission at no extra cost to you. Thanks for supporting the podcast!

For more information, visit our website at biggestgoal.ai.

Alane Boyd (00:02)
When you think about AI workflows, you typically think about achieving success when there are no humans involved. But there's another term for that, over automation. When you remove the human layer, you could be building a system that is actually more headache than just doing it manually. And we're gonna talk about that today.

Micah Johnson (00:23)
I mean, the bottom line, Alane, is when you're automating with AI, you really need humans involved, but it's a different use case. You don't need them to do all the manual work anymore. And we talked about this in a recent episode: you need them for the judgment. I thought it'd be really fun to explore where this could go wrong and why. You know, we said, here's what you need, but we didn't really talk about the why in depth.

Alane Boyd (00:52)
Yeah, I thought something that you said to me recently is we still need humans, so we need to design with humans in mind.

Micah Johnson (00:59)
Yeah, definitely. For anybody listening who's been using Claude lately: yesterday, as of this recording, they released Opus 4.7. And leading up to that Opus 4.7 release, we could see an overall general decline in Opus 4.6. It just got dumber, for lack of a better word.

And it was noticeable by many. There were a lot of people online commenting about it. I know I kept sending you Slacks, Alane, going, what the hell is going on with Opus? It's driving me crazy. And it even affected my dreams, because, you know, we're in this right now. And when something operates at a level and then it gets worse, you start to realize, wow, this

Alane Boyd (01:32)

Yeah.

Mm-hmm.

Micah Johnson (01:53)
human judgment piece is such a must have that we're not all losing our jobs anytime soon.

Alane Boyd (02:02)
And really kind of how fickle it can be, too. We saw, I mean, this was for days, Micah, you were working with Cowork on a couple of things and saying, oh my gosh, this is just not working, it's so dumb. And you had to put in more effort as a human to close that gap because the AI wasn't producing. And also, it went down yesterday.

You know, we can't control that piece when the AIs go down. So we still have to have an expert able to move things forward, or we are going to be controlled by it.

Micah Johnson (02:38)
Yeah, and I do want to put a big caveat on this entire episode. Working with AI, even with these issues, is still better. I would not trade it. I'm not looking at these going, it got dumber for a week, so let's trash this whole thing. It is still 100 times faster, better, easier. I do not want to go back to manually creating everything that we had to create before.

in tasks and comments and deals and the cross-referencing. I don't want to go back to that world. But what you pointed out was perfect, where it would be maybe one or two prompts. Like, this sounds so petty now that I'm saying this out loud, Alane, but I might've had to do like five prompts. Oh, the agony.

Alane Boyd (03:22)
Hahaha

Yeah, I've been going through that too, because Polly, our proposal system, wasn't working for a couple of days. Instead of the automation with the AI working, I had to just go straight to AI, still hours and hours faster. But I just closed the gap. I, as a human, knew what to do, went and did it, and still was able to finish up the proposal. But as a whole,

Micah Johnson (03:43)
Yes.

Alane Boyd (03:51)
the AI automation wasn't working. And so, you know, that's where a human still needs to be thought of as part of it. And they still need to know what is happening so that when things go down, they can step in.

Micah Johnson (04:04)
Basically how it works, or where humans fit in. And, you know, we chat with a lot of people, and there's a trend that we see. In fact, we got an email on this. Was it yesterday, Alane? Maybe a couple of days ago. You know where I'm going with this, don't you? Yeah. It was still the, how do I feed everything in my entire business to AI and let it make all the great decisions for me?

Alane Boyd (04:18)
Oh yeah, a couple of days ago or something. Yeah.

Micah Johnson (04:31)
If you're listening and you sent us this email, please do not do this. Whether you work with us or not, don't pay somebody to do this for you. I think there's a little bit of headline fatigue, and maybe brainwashing that we get into, where it's like, if it's automated, that means no humans. And then if it's not successful, then it's a failure and

Alane Boyd (04:49)
Mm-hmm.

Micah Johnson (04:56)
You know, a lot of times I kind of want to think about it as a sliding scale. If it's 0% automated to 100% automated, the sweet spot is 80%. We're still in this like 80/20 rule.

Alane Boyd (05:10)
Absolutely. I was thinking about how there are really two camps of people I see: they want to automate everything with AI, or they're scared to start doing a lot with AI because they don't want to lose the human aspect. And there is absolutely a middle ground between those two things. Or maybe it is like 80/20. So

80% of it could be automated, AI could be part of that, but you don't want it to be 100%. You don't wanna lose the human part. You don't wanna lose the human interaction. The AI automation piece comes in for what you've already figured out and is a known, consistent piece of the workflow.

Micah Johnson (05:55)
Alane, I've never thought about that on the same spectrum. So the people that say, I want to automate everything, they need to back down to the 80% a little bit. And the people that are like, I'm not even going to touch this because I'm scared to lose the human aspect, they need to bump up to the 80%. And I mean, we'll talk about this in a little bit, but it's the whole concept that you're not actually losing any of the human aspect, because that's where you need the humans in the system, is to keep the human aspect. What you want to lose is all that time-consuming grunt work.

Alane Boyd (06:32)
Yeah, I feel like this has been a thing that I've been talking about a lot lately. The value of humans isn't moving data between software. The value is our brains: thinking, strategy, how we understand tone. I mean, I remember leaving a meeting recently and we're divvying up like, okay, this is what we need to get done for this

this workflow, this, this, and this, and no, we're not doing this. And one of my team members was like, well, why? We talked about it and they seemed interested. I said, but they weren't. You have to be able to read that even though their words said something, their tone went along with it in a different way. And that's where experience comes in. Humans get that experience listening to tone of voice and how people say things, what is really important in a conversation versus something that is

maybe not even a nice-to-have. It's such a low priority.

Micah Johnson (07:28)
And

if you're in sales, you see this all the time. Good salespeople can read between the lines, understand the tone, understand when somebody says, let me think about it. They're like, crap. I've totally lost this sale. Whereas a bad salesperson goes, they said they're going to think about it. This is a done deal. I can't wait to, you know, buy my house in Maui from closing this deal. And that's what we see.

man, Alane, I'm having so many revelations already in this conversation. We didn't prep any of this, for those who are listening. You know, AI doesn't have that. AI goes, oh, they said they're going to think about it, this prospect will probably think about it. And maybe there's some other contextual clues in writing. But we all know if you've

ever texted, if you've ever received emails: one person is super happy and giddy when they wrote it, and the person who reads it is like, oh, she is pissed off. And it's just because the written word does not translate this stuff.

Alane Boyd (08:40)
Yeah, it's really where that human judgment piece comes in. And we've had an episode about human judgment, but you have to build the automation with that in mind. Whether you're using Claude, or you're using n8n or some other AI automation tool, you still have to leave room for that. And Micah, when we were thinking about this episode, I was thinking about how we plan our AI agents,

and how we plan even our scheduled tasks in Cowork to do some of the automated things, is that they're micro. Some of them are micro-automations with agents. So we have 41 agents, and we've talked about this before, we have 41 agents who work for us, and they're broken up because we want to have a piece where, if the human needs to be a part of this, then the next trigger can happen that the human

Micah Johnson (09:16)
Mm-hmm.

Alane Boyd (09:34)
can be responsible for, but there has to be some oversight.

Micah Johnson (09:39)
Yeah, absolutely. I mean, you're going to get into, for lack of a better word, a trap if you try to go the 100% automation route. And the caveat is with AI, right? If you're doing regular automation, sure, there are definite cases where you can have this run behind the scenes. You do not need a human in there. When you involve AI, that's where you can get stuck in this trap. And I can all but guarantee, if you try to do a full automation

with AI, the trap is you're going to spend the time you saved, if not more, fixing the outcomes of what AI guessed, didn't have the context on, or went the wrong direction on because it misread the transcript and the tone and all of that stuff. Depending on what you have automated, you're going to be fixing that outcome, which is messy and difficult, versus 10 seconds of reading a summary plan,

Alane Boyd (10:32)
Mm-hmm.

Micah Johnson (10:37)
giving it some feedback and approving it so it has a clear, absolute perfect picture to then go do other things with.

Alane Boyd (10:46)
Micah, whenever you talked about the difference between automation and then using AI with automation, I was thinking, how can we explain what we're talking about here? And with automation, it is a very clear map of data moving between systems. There's no decision-making happening. This happens, it goes here, it goes here, and it goes here, and it's 100% mapped. When we had...

Micah Johnson (11:12)
100% mapped.

100% logic, like if-this-then-that kind of stuff. The stuff that we all grew up with in our basic computer programming classes.

Alane Boyd (11:20)
Yeah, or your basic Make or Zapier automations before AI came and changed things. So when we're talking about automation, there's no decision making happening. That's already been decided by a human. So that's 100% data movement. What we're talking about is when you have AI as a part of it, that is making a decision, but in a formulaic way. So that would be the difference that we're talking about. True automation doesn't have AI. It's going to be

done consistently the same way every single time. But when you introduce AI into it, where it is taking context from something, taking our transcript from this episode and building out the description and show notes, it is making decisions. So it needs to have a clear understanding of the context and the expectations. So when you have AI doing 100% of that for your workflow, you're going to get a mediocre or not great output.

Micah Johnson (12:18)
Yeah. I've got an example I want to work through on both the 0% and the 100%, Alane, that I think will be very useful. But what you just gave, using this transcript from this podcast episode to create the show notes: when you say decisions, what AI is actually making decisions on is word use in this case. There's no logic there that we program into our AI to say, I mean, obviously the AI companies are, but

Alane Boyd (12:23)
Okay.

Micah Johnson (12:45)
And it's a math model, but you know, we're not able to control that particularly. The input is the transcript, and the output is the selection of words that AI decided to use based on the transcript and the context that we gave it. Now, those are the decisions, right? We're not always talking about big-company corporate decisions.

Alane Boyd (13:02)
Mm-hmm.

Micah Johnson (13:10)
You know, do we reinvest into this business unit? I wouldn't give AI those decisions either, but I just wanted to point out that even just word choices are decisions that AI is making.

Alane Boyd (13:22)
Yeah, I remember reading an article recently, I think this was a few weeks ago, about Goldman Sachs, it was from them. And their IPO process, when they're putting together the package for a company that's gonna go to IPO, they used to have, I think it was like, I'm gonna get the numbers wrong, but just to convey the magnitude, it was like six people working for several weeks on putting this package together. And now they're using AI and it's

down to just a few days and a couple of people. They still have people as part of the process, but it's this grunt work or tediousness that has to be done, box-checking that can partly be done by the AI, or even just some of the research. If you have the data there to give it, you know, we talk about RAG a lot. Well, if the data is there and you can create an AI-friendly database,

then you have something for it to make those decisions off of, or at least collect the data and answers for you to make those judgment calls.

Micah Johnson (14:22)
Let's run through a quick scenario because I think it'll help kind of reinforce this fact, Alane, or these two perspectives and where being at that 80% mark makes really good sense. So anybody who does client calls, you typically have your call recorders now, your note takers, that produces a transcript. That transcript is gold because you can take that and give it to an AI and help it identify things out of that transcript.

If we go on the 0% AI side of things, you have to have somebody review the video, review the notes, manually write the follow-up, manually pick out the action items, manually find the upsell opportunities, manually pick out which items should be backlog items. It is all of that. And all of that takes human energy. Even if you divide that up across a couple of people, then you have

knowledge transfer issues that you've got to convey. Well, what client is this? Why are we talking about all of that stuff? We've all been through it in the past. That is 0%, right? You've got knowledge transfer, you've got manual time, all that heavy lifting is cognitive brain power that is not going towards being the human aspect towards the client. That's just prepping stuff to get towards the client. And then by the time you get all that done,

You're exhausted. You're like, I just got to get this damn email out, because now it's been hours and I'm exhausted. And so you write a quick email and you write crappy descriptions. Maybe, if you even put a description in the action items in your project management tool, it's most likely just a task and an assignee, and you put a due date and you're done, because I'm so over this right now and I just want to go eat lunch, or I'm under the gun, there's so many other things that I've got to do. That's your zero percent.

Alane Boyd (16:08)
Right. I was underestimating the value of this at first because it seemed like it didn't save that much time. But I think I was too far removed from that part of the process for such a long period of time. And what I think is really amazing with some of this that you're talking about too is the

Micah Johnson (16:27)
Mm-hmm.

Alane Boyd (16:36)
end result that we go to the client with is better than if we had just done it by ourselves. It's so much more organized and clear. And then even going into the client meetings, we have these beautiful documents with the agenda laid out, with examples. Now, this still very much took a human, though, to hear what they were saying and then to tell the AI what we wanted as the end result. Like, hey, this piece was important and we want to map this out here.

It's got the whole agenda put together with talking points that we've decided on and then ideas from us coming out of that meeting like, hey, this is our value. Our value is how do we take what these clients' pain points are and turn that into a result that they can use and implement. And we go in with a four to five page document that we share with them and go through and be like, hey, this is what we're thinking. These are the opportunities that we see that you could change this and do this.

and it'd be impactful for you. And then, hey, we've also been keeping this backlog down here of ideas for team training or whatever, and we can edit these as we go along, but this is our log that we're keeping track of for you.

Micah Johnson (17:46)
So Alane, you are nailing it, but this is like three different solutions. This is the thing, right? Like, from your perspective, you're going, we started with this call debrief. And like I gave the example of the 0%, and then already you've seen what we have: our call prep. So we have the call debrief, we have the call prep, we have the

Alane Boyd (17:57)
Mm-hmm.

Mm-hmm.

Micah Johnson (18:09)
how do we convey what the prototype does and is, and what does it look like as a diagram? So all these things, at the 0% level, one, we could barely get even nearly as good as we can now, because it took so much time and effort and knowledge transfer. Before we go too deep into some of these other use cases, Alane, I really want to go on to the 100% level to illustrate why that also doesn't work.

Alane Boyd (18:25)
Mm-hmm. Mm-hmm.

Mm-hmm.

Micah Johnson (18:37)
And so if we take this call debrief line and say, all right, if we're debriefing a call, the technology is there. We could say, let AI review the transcript, and then let AI create all the tasks, email the client back, backlog all these items, update the documents like you're talking about, Alane, do all of these things, and we never have to touch it.

Alane Boyd (19:00)
Mm-hmm.

Micah Johnson (19:00)
At first

glance, you're like, hell yeah, I never have to look at this again. I can just get off a client call, plug in a quick command, and everything is good to go. Or we could automate it as soon as the transcript is ready. The technology is there, but play that out a little further. What happens when it gets an action item wrong? How do we unwind when we sent that to the client? Hey, your action item is to buy Micah

Alane Boyd (19:14)
Mm-hmm.

Micah Johnson (19:27)
chocolate cake for his birthday because the AI heard chocolate cake and Micah's birthday is coming up and it sounded like an action item, but it couldn't put enough together to go, maybe that's not an appropriate action item to send back to the client.

Alane Boyd (19:35)
Mm-hmm.

I mean, that's so true. Like when we were talking earlier about one of my team members misunderstanding what is an action item or not, the AI is picking up on that as well. It doesn't have that clear delineation or expertise. And so we have that piece. And like you mentioned, I was talking about a few different

automations that we have with AI, but they're not sending to the client. We still have oversight in each one of those three pieces before anything gets shown or sent. And that's that human design piece that we put in.

Micah Johnson (20:13)
Absolutely, absolutely. So that's where the 80% comes in. Let's get rid of that 80% of manual stuff. Let's let AI do all the cross-referencing. Let's let it look at past deals and past emails and notes inside of project management platforms. It's really good at doing all of that, and then correlating and cross-referencing that against the transcript. Let's have it do some web searches and pull stuff from their website. And, you know, then

articulate a plan back to us. That gets us to the 80% mark. From there, we do the work of going, let's change this, let's edit this, let's take the buy chocolate cake for Micah's birthday out of the action items for the client. Let's do all of these things. Let's follow up with this. You know, that's going to be that remaining 10 to 20%, sometimes less. And it avoids the issue.

on both sides. You know, we're not exhausted and under the gun mentally and time-wise. And we're not doing this massive cleanup like, oops, didn't mean to send that. Or, I didn't even realize it created these tasks, I didn't even realize it put it there. That's where a lot of the distrust comes from. You sit at that 80% mark, and it's a beautiful system. Now we're not fatigued. We keep the human element. We actually get to do better work.


Micah Johnson (21:40)
we provide better things to our clients and internally, just overall, it's that perfect scenario.

Alane Boyd (21:40)
Mm-hmm.

I think, too, Micah, about how to maintain these systems. Some of what we're talking about are Skills that we've developed in Claude Cowork. Some are n8n automations that have AI as part of them. So there's a bunch of different ways that's happening, but a lot of our AI automations are small, because maintaining them is easier, making edits

Micah Johnson (22:10)
Mm-hmm.

Alane Boyd (22:13)
to them is easier, and fewer errors happen, because they're just little small units, and it forces the human to be a part of it. But also, if you wanna update anything, I mean, I think about how often with knowledge bases at a company, or SOPs, standard operating procedures, so often a company's like, yeah, we did that once three years ago, but everything's out of date, they're no longer relevant. That 100% happens with any type of automated

workflow, with or without AI, because things change. The way you do things changes, the way that you say things changes, how you execute on work changes. So you have to keep things updated. Well, it feels like a monstrous task when you have a giant automation. When you have smaller ones, it's like, okay, well, I can just go into that agent, or I can go into that Skill, and do something real quick.

Micah Johnson (22:58)
Yes.

And this gets magnified with AI. With traditional automation, you can build big automations. With AI automation, keep it small. Like you said, Alane, when you were describing all of those things that we're doing, it feels like it's all one system, but it is a call prep system, it's a call debrief system, it's a how-this-works documentation system. And those are all small things that we've designed to work together, but

Alane Boyd (23:21)
Mm-hmm.

Micah Johnson (23:36)
It's not an end-to-end monolithic thing where we're going to start this automation when we start talking to a client, and then that automation runs all the way through to our next meeting or something crazy like that. It is assisting what we're doing as humans.

Alane Boyd (23:54)
Yeah, and if you're at the low percent, meaning that you really haven't been using AI, maybe you've dabbled here and there, maybe you've been using chat, you can start small by trying to develop a Skill in Claude Cowork. You don't have to go all the way to doing something in n8n, where you really need to understand how to build agents. But in Claude Cowork, that is like the really great next step.

Try using a Plug-in to understand how Skills work, but then develop a Skill, use it, and see how it works. Does it give you consistent results? Once you see, yes, this is giving me the results that I'm trying for, share it with your team and see how it works for them. Something that I've been hearing, I think it's because we've been talking about Claude Cowork so much and companies are just so excited, which obviously we are too, and what they're uncovering

is how inconsistent things are in the company, because each individual person has their own way of doing things. Well, you can't create a Skill and push it out to the team, or create an agentic workflow and have it handle some of the stuff for the team, if you don't have a consistent way of doing stuff. So that might even be your pre-step into getting some of this done, because you have to have consistency.

Micah Johnson (25:12)
Yeah, this brings to mind an example that I can share here as well, Alane. It's a relatively newer one, but it became so helpful. We built a Skill to help us process our sales pipeline. And at first glance, again, you could think about it like, we should just automate all the follow-ups. How many times have we heard that? Let's just automate all of our sales follow-ups. Here's the pattern that works: automate the heavy lifting,

have a human use judgment, automate the remainder. And we applied that to our sales follow-up process. What it's actually doing, it's a Skill in Cowork: it goes to our CRM, it gets a list of the deals that are open, it prioritizes them based on the criteria that we've given it in the Skill, so it knows how to surface things first. And then, for example, if I were to run this, it would give me my top priority

deal that I need to pay attention to; it gives me one. And it does the cross-reference for me, and it drafts a reply with context of why it gave that reply. It might be giving it this podcast episode after we release it and saying, hey, here's a relevant podcast episode. It might be giving a use case that we documented. It might be giving a recent news article that came out in one of our AI news briefs. It will

consolidate all of that. That's the heavy lifting. All I have to do is look at that and go, does this make sense for a follow-up? Send as is, edit, or redefine. That's it. And what used to take 15 to 30 minutes per follow-up, which meant it probably didn't get done, is taking one minute, because of the heavy lifting.

Alane Boyd (26:56)
Mm-hmm.

Micah Johnson (26:59)
I can look at it and immediately know that deal, that prospect, the last meeting I had with them, and go, yeah, let's change this. Cool, send it. And then it's that loop, it's that iteration. But this is one Cowork Skill that I can share with anybody who's doing pipeline follow-up. And it's all the same thing. Now, Alane, you brought up such a great point a second ago, which is standardization. I was able to share that with, say, you, Alane, because

Alane Boyd (27:29)
Mm-hmm.

Micah Johnson (27:29)
We run the deals through the same system in the same way with the same process, et cetera, et cetera. We already had that standardization in place. If you were using a different tool, if you were managing your leads in different ways than I was, and we had another salesperson doing it their own way, managing it in a spreadsheet because they hate CRMs and all this crap, and we let that just proliferate across our org, then this Skill would be absolutely useless to anybody else.

Alane Boyd (27:52)
Hmm.

Yeah, absolutely. The other next step that a company can take, because it feels overwhelming when you haven't taken a step, is finding a champion. Micah and I are our own champions in the company because we're decision makers. Sometimes I'm bringing something to him, sometimes he's bringing something to me, but we're decision makers that can take the action. Depending on where you're at in your company, you

may be too far removed. Like maybe you started this company 30 years ago, and trying to learn all this and being the champion of it is not of interest to you, but that doesn't mean you don't want your company to be AI forward-thinking. So who on your team can be a champion? It doesn't need to go to everybody at once. You can start with one person, let them figure it out, let them know, hey, I'm supporting you in doing this,

and I'm gonna pay for you to have a pro plan subscription. At first, let's get some wins so that you can come back to the team and say, look at how I'm using this, how it saved time. And then maybe you get a partner in crime, and then you have two champions at your company. It's two people that work together that can share ideas before it goes to everybody, because there is something to having support

from the rest of the organization to go forward, but they also need to understand the value in it. If you just dump it on people, like, hey, we're gonna do this, and there's not enough support there or understanding of where it can be used in your company, you're gonna be paying for software that people are barely using.

Micah Johnson (29:25)
And I like to call it a partner in time because that's what you're saving.

Alane Boyd (29:28)
Partner in time.

Well, that's more romantic than crime.

Micah Johnson (29:33)
Hahaha

All right. So easy, really great advice on next steps, Alane. The takeaway from all of this is just start small, start with one, find your champion, give it a try. I know we've talked about this in other episodes: if you don't pay for it, they're going to pay for it. It's only 20 bucks, and they're going to be using it. And what Alane mentioned earlier, I like to phrase as siloed implementation, which is very difficult to either, A, understand

Alane Boyd (29:42)
Mm-hmm.

Micah Johnson (30:00)
what's happening in your organization, and, B, unwind all of this. So pay for the $20, get somebody in Cowork, get somebody who's excited, have them listen to this episode, get this pattern where you let Cowork, or any AI, do some of the heavy lifting, let the human guide it, judge it, and then let the automation take it from there. Huge time savings,

huge boons for the company, for the org, and then grow from there. That's another guarantee in this episode, Alane: I guarantee once you start going down this path, you'll have plenty of those oh-shit moments where you go, if we can do it this way, does that mean we could do this? Or what if we actually automate getting it from the email? Or what if we don't have to manually transform all these spreadsheets and collect from different sources? It just starts

escalating and the ideas start flowing once you start.

Alane Boyd (30:59)
One of the things that blew me away: we just did our Claude Cowork workshop this week, and we identified so many areas, with the people that were on there, where they need help. So if you just think, hey, I'm gonna give them the approval for $20 and they're gonna use Claude Cowork, yes, they will. But I promise you that they could use the support of somebody that has been doing this and already knows all the areas that they can get set up. How do you create Skills?

How do you use Connectors? How do you use Plug-ins? How do those three work together, and how do you get going fast? That is going to save so much heartburn in the long run. We've got another Claude Cowork workshop coming up on May 11th. The first one sold out, so we opened up another date. Our AI agent cohort, we start on April 20th, so there's not enough time now for you to sign up, but we do have another one coming up in June, so stay

up to date on our cohorts because if you're wanting to automate more with AI agents, that would be the next step after Claude Cowork is going and building in n8n. And Micah, we haven't shared this on the podcast before, but we've been getting rave reviews on your AI News Briefs that you do every single day.

Micah Johnson (32:11)
Yeah, I know. If anybody's listening, just leave me a comment that you actually read it, because what's funny is I started this basically for our own benefit, going, we need to keep up with this. I put it live on our community, which is your.biggestgoal.ai, Y-O-U-R, .biggestgoal.ai. But almost every day that I meet with somebody who's in our community, they're like, man,

love that news brief. It's the best one I read, and thank you. But, you know, it is just really cool. There's so much news, there's so much happening every day, that it is super helpful. I'm the one creating it, and I'm still finding things every day. I think it was yesterday, I'm like, Alane, you got to check out this news story.

Alane Boyd (33:03)
You did, but it is really cool and it's free right now to join the community. So if you want to stay up to date on the latest AI news curated by Micah, come and join us in the community.