// transcript — 2347 segments
0:00 Introduction and Overview
0:03 Welcome to the official SaaStr Podcast, where you can hear some of the best SaaS
0:07 [music] speakers. This is where the cloud meets up today on [music] the SaaStr Podcast.
0:13 If you don't know your FTE, if you don't know the answers to these questions, and
0:17 you're spending any material amount of money on an agent, you're wasting
0:20 [music] your energy. And so it will be interesting to see where it goes this
0:24 year. A lot of the agents we use, um, are pushing down-market to be more
0:28 self-service. So far, that doesn't work. So far, I will say, for the most part,
0:33 agents that require deep training cannot be self-trained. It will come. Agents
0:37 are getting so much better; that's the frontier. But wait and see. Be skeptical. If you
0:42 buy a cheap tool that says it's self-trained, make sure it works, and, you
0:46 know, take the time. If you buy a more complicated tool like we're talking
0:50 about, just talk with someone senior enough on deployment, not, again, with
0:53 someone trying to sell you something who doesn't know, and be honest about
0:58 what it's going to take. Um, otherwise it's like going to the doctor,
1:01 getting a prescription for medicine, and never taking it. It's not going to work.
1:10 Hey SaaStr fans, imagine having agents for every support task. [music] One that
1:13 triages tickets, another that catches duplicates, one that spots churn risk.
1:17 That'd be pretty amazing, right? [music] HappyFox just made it real with
1:21 Autopilot. These pre-built AI agents deploy in about 60 seconds and run for
1:26 as low as 2 cents per successful action. All of it sits inside the HappyFox
1:30 [music] omnichannel, AI-first support stack: Chatbot, Copilot, and Autopilot
1:35 working as one. Check them out. Hey everybody, SaaStr Annual will be
1:44 back May 2026, [music] the world's largest SaaS and AI gathering
1:48 for executives. Just this last May, we hosted 10,000 attendees, with 68%
1:54 VP-level and above, [music] 36% CEOs and founders, and 25% AI-first
1:59 professionals. It's the very best of SaaStr: [music] attendees and
2:03 decision-makers come to SaaStr Annual and AI Summit each and every year. But here's
2:06 the reality, folks: the longer you wait, the higher ticket prices get. They're
2:10 cheap now. They're cheap. So just get them early. Lock in your spot today. Use
2:15 my code JASON100 for [music] exclusive savings. Get your tickets at
2:20 podcast.saastrannual.com, or just use code JASON100 when you check out. See you there.
2:24 SaaStr Annual and AI Summit 2026. All right, cool. So, um, yeah, to kick
2:33 things off for today, we wanted to talk a little bit about where we're at today
2:38 with all of our agents, and, maybe more importantly, now that some of you have
2:41 deployed at least one agent and are looking at doing multiple agents, kind of
2:45 on the path that we've been on: what does that really take, in reality, to
2:50 do this multi-agent management? What does that all mean?
2:56 So, just for quick context, I think a lot of you have seen this now, but just as a
2:59 refresher, we have about, you know, 20-plus agents now. We vibe-coded about 12
3:04 apps. They've been used, you know, almost a million times, which is kind of
3:07 crazy. I think we'll probably cross the million mark before the next AI Day. Um,
3:11 and so that's a lot of usage, right? And so a lot of the things you'll see us
3:15 talking about today we also have on saastr.ai/agents. So you can always go there at
3:21 any time and see, like, you know, some of either the third-party tools or the
3:24 in-house ones that we've built. Okay. Um, so what have we learned now
3:33 that we're, you know, depending on how you count, either eight
3:38 months or a year into this journey? Funny, I was looking up
3:43 when we launched Deli, Jason; I think it was like December of '24, but I didn't
3:49 get on it until like the February, March timeframe. So, depending on where you
3:52 count, like, fully deployed, though, I'd say 8 months, with multiple agents in tow. So,
3:58 we started with Deli. If you guys haven't tried it, you can go to
4:00 saastr.com; it's the Jason bubble there. You can talk to his clone. So,
4:04 we added that, and then we quickly learned, you know, it was doing kind of
4:08 like advice and support, and we spent a lot of time training it. We've talked
4:11 about it in other pieces of content here, but that was our first foray into
4:16 an agent. And I think support and sales, I see now, are the two most common use
4:20 cases where people deploy their first agent. So, super, you know, super common
4:26 thing to do there. And then, you know, then we added an outbound AI SDR, then we
4:30 added multiple of those. Now, that's not necessarily, I think, a path that everyone
4:35 needs to take. You could probably deploy one outbound AI SDR agent and do it well.
4:43 And then secondly, I think, too, you know, we also looked across our sales funnel at
4:49 where we want to be in terms of, uh, what else we want to deploy in
4:51 go-to-market. And so we quickly added, you know, multiple outbound AI SDRs, an
4:56 inbound AI SDR agent, and then multiple agents across go-to-market, and then a
5:01 few custom vibe-coded apps, um, some of which we're using internally now, but
5:04 most are external-facing, like, you know, the pitch track, valuation, and
5:09 things like that. >> It's a good summary. A lot of folks have
5:16 followed the journey. But, um, we did push the limit here, as some folks
5:20 know: after SaaStr AI Annual last year in May, basically anyone that left our tiny team,
5:26 we replaced them with an agent. So we've been on this journey;
5:31 we've deployed, you can see our whole, uh, list. We've deployed a bunch of startups,
5:35 and we're also one of the leaders on Agentforce, and Amelia will touch on that.
5:37 We'll keep chatting about that, because I think it's just helpful for you guys to
5:42 see these apps in production. Um, and then we'll get into it. And we
5:47 also started off vibe coding for fun, but, as Amelia will get to, we found,
5:51 and we really recommend, you buy something, don't build: the 90/10 rule Amelia will
5:55 have here. But the latest thing we've added, which we've talked about, is we
6:00 built our own VP of Marketing. >> And we'll talk about why in a minute.
6:04 >> Yeah. Um, so that's, yeah, that's kind of been our journey. Again, it's not
6:08 necessarily that everybody needs to be at, you know, the number of agents we
6:11 are. We are for a lot of reasons. We try different agents, right? Like, part of
6:14 the SaaStr community and ecosystem is we are trying different agents. We have
6:18 different partners, and so we also have, like, an underlying need to just try a bunch
6:22 of different things. I also just, like, want to try different agents and see
6:27 what works and what doesn't. And so there's, like, that inherent piece, but that's
6:31 not necessarily what you need to do for your business. It may not make sense to
6:34 be at the same level and number of agents we are. But okay, so eight months
6:38 later, you know, just kind of, like, the highlight result is, you know, does all
6:43 this work? I think, now that we're, you know, maybe a year,
6:46 eight months into this journey, I'm starting to see a little bit more, like,
6:50 skepticism, honestly, weirdly, on LinkedIn. I don't know what you're
6:54 seeing, Jason, but I see some people becoming a little bit
6:58 disenchanted with AI agents. And I see a little skepticism. So, does
7:01 this work? I think for us, you know, we now have, 8 months in, $4.8 million, and
7:06 counting, in additional pipeline sourced via agents, and I'll talk
7:09 about why it's additional. And then about half of that, so $2.4 million, is
7:14 closed-won revenue that was, you know, first-touch sourced from an agent across
7:20 go-to-market. So a lot of good things there as, like, a proof point.
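As an aside, the attribution arithmetic behind those numbers ($4.8M agent-sourced pipeline, about half of it closed-won) can be sketched in a few lines: first-touch attribution credits a deal to whichever channel made the very first contact. Everything below, the deal records, amounts, and channel names, is hypothetical illustration, not SaaStr's actual reporting code.

```python
# First-touch attribution sketch: credit each deal to the channel that
# made the earliest touch. Hypothetical data, for illustration only.
from datetime import date

deals = [
    # (deal_id, amount, [(touch_date, channel), ...], closed_won)
    ("d1", 300_000, [(date(2025, 2, 1), "agent"), (date(2025, 2, 9), "human_ae")], True),
    ("d2", 500_000, [(date(2025, 3, 4), "agent")], False),
    ("d3", 200_000, [(date(2025, 1, 15), "marketing_email"), (date(2025, 1, 20), "agent")], True),
]

def first_touch_channel(touches):
    """Channel of the earliest touch on the deal."""
    return min(touches, key=lambda t: t[0])[1]

# Pipeline where an agent made the first touch (d1 and d2 here).
agent_pipeline = sum(amt for _, amt, touches, _ in deals
                     if first_touch_channel(touches) == "agent")

# Of that, the closed-won portion (only d1 here).
agent_closed_won = sum(amt for _, amt, touches, won in deals
                       if won and first_touch_channel(touches) == "agent")

print(agent_pipeline, agent_closed_won)
```

Note that d3 closes but is credited to marketing, not the agent, because the email touched first; that is exactly why first-touch numbers can understate what agents contribute to nurturing.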
7:27 Additionally, yeah, we've seen some underlying things also improve via the
7:32 agents. So, our deal volume has more than doubled, right? I count a lot of
7:36 that towards, and credit a lot of that towards, the agents working, you know,
7:41 24/7, 365. They can always answer a question. They can always book a
7:43 meeting. They can always reach back out to you. Now, sometimes, you know,
7:47 the humans here, like me, David, Jason, have to take the meeting. So, um,
7:51 sometimes they're limited by our human capacities, but I credit that to the
7:56 agents working, you know, around the clock, year-round. Now, our win rate has
8:01 nearly doubled. I think that's just the nature of the agents, you know,
8:05 especially when it comes to folks that are inbounding, and also how it's doing
8:08 outbound: just having a lot more context, right? It's a lot better
8:12 context. It's a lot better outreach in some cases. But then, essentially, for us,
8:17 like, across me and David specifically in sales, it helps us with our
8:21 conversations, because we already know what this person has said to the agent.
8:25 We can see the exact conversation they're having. And, on the
8:28 website, um, we can literally see what else their company has been doing with our
8:32 agents, and we use that in the meeting, right? So it saves a lot of time. It's also a
8:35 lot more qualified when we get on that call. And so I think a lot of that has to
8:39 do with some of the nurturing there. And, I think more importantly, too, in those
8:44 stats: it did not cannibalize our other inbound revenue sources. Those have
8:49 just been augmented by our agents. And another thing I like to remind folks, too,
8:54 is, like, um, it's not that we dropped a bunch of stuff either when we deployed
8:58 these agents, right? I think a lot of folks now will be like, well, okay, if I
9:03 just get an outbound AI SDR, then maybe I don't need any human SDR. And
9:07 maybe that's true, right? Between David and I, we do some
9:10 outbound ourselves, so we don't really have a human SDR, per se.
9:15 But I think there are a lot of cases in which you might still need
9:20 both, because we do personally respond as humans to each message that
9:26 our AI agents produce, and so there's a lot of, you know, time needed there. I
9:31 don't necessarily like the AI autopilot responses; I still think it's better to
9:34 come from a person who actually knows the business. And I think, too, in that,
9:40 you know, a lot of this has been augmented by our agents, you know,
9:45 in helping us book, again, more meetings, helping us understand the leads a little
9:49 bit. Again, we did not drop the other things we're doing, right? We still send
9:53 marketing emails. We still do outbound. We still send gifts to people. We
9:56 still invite people to come to the house. Like, all the things we used to do
9:59 before, with the agents, like, yeah, we can do them a little better, but we're
10:03 still doing them. And so I think that might be surprising to some folks, but
10:07 just know, I don't think it will cannibalize anything if you do it right.
10:10 But I also think it can definitely augment, versus, you know, you don't
10:15 necessarily have to replace. We've done that, and it's
10:18 worked, but you may see that doing a mix works. Okay. So, cool. But, you know,
10:30 um, here's maybe the honest truth that you may not see on LinkedIn or X,
10:38 and that is that we maintain these apps every day. Like, literally, even this
10:43 morning, before getting on AI Day, you know, I'm checking our agents.
10:48 And I think the important thing here is, like, the agents and the humans have to
10:54 rapidly evolve and change constantly. Like, it's such a mind-share killer for
10:59 myself, for Jason. Like, we're in these agents, you know, I think 15 to 20 hours a
11:06 week each. That's each, not, like, split between the two of us; that's each of us constantly,
11:11 like, iterating with our agents, constantly seeing, you know, what are
11:14 they outputting, checking the responses, making sure it doesn't hallucinate,
11:18 making sure, you know, it's talking to folks the way we want it to
11:22 talk to people, making sure it's adding value, making sure it's not degrading,
11:25 right? Sometimes you see these agents degrade over time. Um, and so I
11:30 think the important thing here is, like, it is kind of a real killer. It does
11:33 take a lot of time. I don't think you can replace the time of managing, you
11:37 know; we've just seen the time shift. Like, the time we used to spend managing
11:41 slightly more people on our team, we now spend that same amount of time, if not
11:44 more, managing the agents. But it's just a lot different, right? Like, there's
11:49 no people drama, really. But the agents just work at a much
11:54 higher capacity and higher scale than a human being, such that it's hard to
11:59 keep up with them. And I put this here in bold: you know, I've been trying for
12:03 a long time, for the last few months, to keep up with my agents, and then I
12:06 realized it was futile, because I can't do it. Uh, but I try and keep up as
12:10 best as I can. And in that, what I truly mean is, you know, anytime we get a
12:14 response, we have a system that will Slack us from, like, any of our agents. So,
12:19 whenever there's an interaction, or an agent is having a conversation with somebody,
12:23 and we want to reply, and, you know, again, we like to reply ourselves, we
12:28 do try and respond to those people literally instantaneously, if not
12:32 in real time. Sometimes we are asleep, and so, you know, we respond to them
12:36 first thing in the morning. But I've realized I can't keep up with my agents;
12:41 they're smarter than me. >> I'll tell you, um, just one nuanced
12:45 learning, actually, from this week. So, I was meeting last night with the CEO of
12:51 a next, we're already in the next, generation of AI go-to-market agents, um,
12:55 one that's already got millions of revenue, um, and is publicly launching in a few
12:58 weeks, but they already have millions of revenue. And, um, I've known them
13:02 for a long time, but I asked what the secret sauce was, and the secret sauce
13:08 was: they do everything. They do the onboarding. They do the tagging. They get the first campaigns
13:15 running. They do everything, and they do it almost to such a fault that
13:18 [clears throat] some of the customers think it's too easy. They don't even
13:21 realize the energy that's going into it if they haven't deployed an agent yet.
13:28 And the learning from that is just: if you haven't deployed many agents, or
13:32 any, for real, you've got to have an honest conversation, not with
13:35 someone in sales that doesn't know how the product works or use it themselves,
13:39 but with a forward-deployed engineer, with a leader, and find out what it is going to
13:44 take to be successful up front, the first 14 and 30 days, and every day
13:48 thereafter. And then you've got to do it, or it will fail. >> Yep.
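As an aside, the alert loop Amelia described earlier, where every agent interaction gets pushed to Slack so a human can jump in almost instantly, can be sketched with a Slack incoming webhook. The webhook URL, agent names, and event fields below are hypothetical; only the JSON `text` payload reflects Slack's actual incoming-webhook contract.

```python
# Sketch of forwarding one agent interaction to Slack via an incoming
# webhook so a human can reply fast. Webhook URL and event fields are
# hypothetical; Slack incoming webhooks accept a JSON body with "text".
import json
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def format_alert(event: dict) -> str:
    """Turn one agent interaction into a Slack message body."""
    return (f"*{event['agent']}* is talking to {event['visitor']}\n"
            f"> {event['last_message']}\n"
            f"Reply here: {event['thread_url']}")

def notify_slack(event: dict) -> None:
    """POST the formatted alert to the Slack incoming webhook."""
    body = json.dumps({"text": format_alert(event)}).encode()
    req = urllib.request.Request(
        WEBHOOK_URL, data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)  # fire-and-forget; add retries in practice

event = {"agent": "Inbound AI SDR", "visitor": "jane@example.com",
         "last_message": "Can your agent book a demo for Tuesday?",
         "thread_url": "https://app.example.com/threads/123"}
print(format_alert(event))
```

The point of the design is latency: the human sees the full context of the conversation in the Slack message and can reply in the original thread within minutes, which is the behavior described above.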
13:52 >> And meet with the best of them. Um, and, you know, if you don't know your
13:57 FTE, if you don't know the answers to these questions, and you're spending any
14:01 material amount of money on an agent, you're wasting your energy. And so it
14:04 will be interesting to see where it goes this year. A lot of the agents we use, um,
14:09 are pushing down-market to be more self-service. So far, that doesn't work.
14:13 So far, I will say, for the most part, agents that require deep training cannot
14:18 be self-trained. It will come. Agents are getting so much better; that's the
14:22 frontier. But wait and see. Be skeptical. If you buy a cheap tool that says it's self
14:27 trained, make sure it works, and, you know, take the time. If you buy a more complicated
14:32 tool like we're talking about, just talk with someone senior enough on
14:36 deployment, not, again, someone trying to sell you something who doesn't know,
14:40 and be honest about what it's going to take. Um, otherwise it's like going to
14:43 the doctor and getting a prescription for medicine and never taking it. It's
14:46 not going to work. It's literally like that for an agent, right? Um, but this
14:50 was the first one I saw that could do, like, the type of stuff Amelia and I are
14:54 talking about, but without you doing any work. But it's because they have a huge
14:57 human team. They call them forward-deployed AEs in the beginning, and then
15:01 other folks take it over. But that's an extreme case. I haven't seen any other
15:04 app like this that can just do it without consistent training and work
15:09 every single day. Every single day. [snorts] >> Yeah, we, you know, we do spend a lot of
15:13 time per week actively managing all of our agents. Just I think to Jason's point, be
15:20 prepared, right? Again, I think I see too many folks who see the
15:23 really good stats, and they get, you know, even our stats are, you know,
15:26 fairly good, and they get a little mesmerized by them, thinking, you know,
15:31 okay, it's AI, I can just prompt it and it'll do it fairly quickly. I'll
15:36 say too, like most of these agents have some sort of prompting in them, but
15:41 they're not necessarily all built via a prompt, right? Like don't think about it
15:44 the same way you think about putting a prompt into OpenAI or Claude. It's not
15:49 going to function the same way. There is some prompting in
15:53 each of these third party tools, but at the end of the day, you're going to have
15:56 to like figure out what works, put that into, you know, their prompt builder in
16:01 whatever format they have, um, and refine it from there. And then there's
16:03 usually additional steps other than a prompt. So I think some of the tools
16:08 we've seen are kind of getting to that point. Maybe they'll get there like by
16:11 the time we're at SaaStr Annual or maybe right after in like the second half of
16:14 the year. Um, but most of them are not just a prompt builder. So I think that's
16:18 another thing to just bear in mind: if you're used to kind of like
16:23 this easy path that Claude and ChatGPT and OpenAI have, you know, kind of
16:28 popularized, like they're not all like that. They're not all that
16:31 easy. Now, I will say you can use things like ChatGPT and Claude to make your
16:35 prompts better. I do that all the time. Like, I'll put whatever context I'm
16:38 putting into our agents. I'll run it through Claude. I'll run it through
16:42 ChatGPT and see what it suggests to make it better. Sometimes I'll take or leave the
16:45 suggestion. Sometimes not, because, you know, the AI doesn't know your business
16:50 the same way you do. And so, I think that's just another thing to be
16:53 prepared for. So, how did we get to this? I think this will address some of
16:57 the questions as well. Yeah. How did we get to these results of, you know, 4.8
17:00 million and counting in additional revenue, 2.4 of it closed-won. So
17:03 about half that. And then also, you know, we've now crossed the 60K
17:07 mark in how many emails and interactions our AI agents have had just
17:12 in the sales funnel, right? That's not even counting the almost close to a
17:16 million we've had in our proprietary interactions with our vibe coded apps.
17:20 But that's a lot right there. >> You were going to explain how
17:24 this looks, and given how tiny we are, it's pretty impressive numbers. But if I
17:28 had to summarize all of this, and challenge me if I'm wrong cuz you're
17:31 doing the real work, right, I think some of the key here is not that
17:36 all these emails were the best emails that have ever been sent in the history
17:39 of mankind. Um I know you think they're great. I actually just think they're
17:42 okay. I don't think they're bad. I think they're better than most of the outbound
17:46 emails I'm going to get today, during AI Day. The ones we send are
17:49 better, but they're not the best I've ever seen; they're not something you can spend
17:53 an hour crafting. I think the number one key, and that's why the 60,000 number is
17:59 cool: we are touching lapsed folks, folks we forgot to talk to, folks we don't talk
18:06 to enough, more often. We are connecting with more people more often. Not spam,
18:11 right, but we couldn't do 60,000 high-quality emails manually or even with
18:16 old school outreach tools. We just couldn't do it, right? So, I think the key
18:22 to this, tell me if I'm wrong and then I'll be quiet, I think it's just
18:26 more high quality pretty good interactions. >> Yep. >> That's the thing. We're getting scale.
18:32 And that's why, when I think about everything we've learned, and
18:34 you've done most of the work, Amelia, if I had to advise people that are earlier
18:38 on their journey, find something in your go to market motion that just isn't
18:42 getting done or is getting done very at a very mediocre level.
18:45 >> Yep. >> Then put an agent. Don't try to replace
18:50 what's working well. Do that as your 10th agent or your 20th. Like literally,
18:55 we're such a tiny team. We just weren't reaching out to enough people in our
18:59 base, in our activated base. And so that's the low-hanging fruit for us. We
19:03 just could not do it. Sure, you could put something in a dated Outreach or Salesloft
19:08 cadence, but that doesn't work, right? But we never would have done this otherwise.
19:13 So find that low-hanging fruit, the stuff in your funnel that you're just not
19:16 getting to. The customers that are too small, the customers that take too long
19:20 to respond and your team doesn't want to do. The customers that have,
19:25 you know, lower scores, right? They
19:29 still have intent, but no one wants to call them back. Do those ones,
19:33 whatever your low-hanging fruit is, because then even if you get some yield,
19:38 it's magical. >> Yep. I agree. So I think again a little
19:44 bit of a misconception here, related to some of the chatter: the
19:50 formula for us is to copy your best human. Like, as you're deploying, maybe
19:53 you've already deployed one agent, maybe you're deploying your next agent, to
19:57 Jason's point, right, do something that you could get a lot of scale out
20:03 of by adding an agent and do pretty good. There's just some things the agents
20:08 can't slash shouldn't do. Like, obviously we love agents. Like, I love our agents. I
20:11 use a lot of them. You know, I just added the AI VP of Marketing. We'll
20:15 show you guys. But there are some things I'm like the agent would just suck at
20:18 that. So I'm just not going to do it. It's just like there are some things I
20:22 still need humans to do. Like a lot of the production stuff we're doing for SaaStr
20:25 Annual. I'm like, I still need a human to do that. Dude, the agent is not there
20:30 yet. But what we also mean by copy your human is if you're going to add AI
20:35 agents, add them to help you scale, right? It's on scaling: more emails,
20:40 more meetings, more clicks, more volume. Figure out what works first. I see too
20:45 many people who, you know, okay, they want to automatically give an AI SDR to
20:51 their SDRs. And I'm like, okay, well, are those SDRs new? Did they just
20:54 join? I don't think it's a good idea to give it to every single SDR. I think
20:58 there are a lot of reasons that would get you into trouble
21:03 fairly quickly in terms of workflows. But I also think, if you don't know
21:08 what works first, I don't get this mindset of like, oh, it didn't work, but
21:12 I'm just going to add AI and it'll magically work now. Like, no. If it
21:16 didn't work or wasn't working before AI, it's not going to magically work now.
21:17 Evaluation and Build vs. Buy Strategy
21:19 And somebody asked me this question yesterday when I was doing a Salesforce
21:23 webinar like, okay, what if I'm a super early stage startup and I don't know
21:27 what works? And I was like, well, do you have any customers? They're like, yeah,
21:29 we have like, you know, 10 paying customers. I'm like, "Well, go ask your
21:33 customers why they bought you." Like, everybody has at least a few customers
21:36 or maybe if you're super early stage and you've got a few folks on a trial, just
21:41 go ask them. Like, figure out what worked. Figure out what got them in to
21:44 your product and is getting them hands-on with the product. Figure out what works first,
21:49 right? We had so much data that we ran through before we put it into any of our
21:53 agents on what was working, right? Like the best email copy
21:58 for doing outbound, the best responses of how we should follow up with inbounds,
22:04 you know, the best um contacts and verbiage about SaaStr, about SaaStr's
22:09 events, about sponsoring SaaStr. Like we went through all this data, we went through
22:12 all this context, we flagged everything that was the best of everything before
22:19 we put it into any agents, any AI, etc., whatever. So, you know, train the agent on
22:24 what works best. I think I see too many people now falling into the trap of they
22:28 want to add AI into something new. And sometimes you can, and it will work to
22:32 some degree, but I think if you do it and you train it on the best of
22:35 everything, it will work that much better, right? It will get you to pretty
22:38 good to Jason's point. Like it'll get you to pretty good emails. They still
22:42 may not be the best on planet Earth, but I think it'll
22:46 put you over that bar of pretty good versus crappy AI emails that we've all
22:50 seen or even crappy human emails that we've all seen. So yeah, that's why I'm convinced
22:55 on it. But yeah, I think, you know, you have to train it on the best of
22:58 everything. And if you don't know what that is yet, I would take that time,
23:02 take a week, figure that out before you know you deploy your first or next
23:07 agent. So that and then see where that gets you. I feel like you'll have a
23:12 better output because we are constantly iterating our agents now to make sure
23:16 they have the best of everything and that they know everything that we
23:22 know, like as we know it, right? So like as we get, you know, speakers for SaaStr or
23:26 we have new things that we're doing or now we've got like lounges and stuff or
23:29 like new things in our sales process, I'm constantly making sure the
23:33 agent knows all this so that it can talk about it. Okay, so I want to address some of the
23:40 questions on the chat, of, like, you know, some folks are asking about
23:44 evaluation tools, like what's our process, and then this is the 90/10 rule
23:50 that Jason came up with, but I really do agree with, and I think it's a good
23:56 one which is you know buy 90% of your AI stack and I'll talk about the evaluation
24:00 process we've done in a second, and only build the 10% where I
24:05 think there's, you know, no vendor that can do this well, and
24:10 it's either a P1 priority or, as you'll see in like our AI VP of Marketing, you know,
24:14 I built that agent cuz it was a commodity, like it was something where
24:18 even with all of our agents now, I was like, I still have so much data from
24:24 SaaStr internally that I want to act on and I want to deploy this agent in a way
24:27 that maybe I don't need it to run everything automatically. So, it's a
24:32 very specific use case, but that was where I kind of built that and that's
24:35 where that kind of came from, right? It's like, okay, I had all this
24:38 data. I wanted to do something that was more internal facing, not necessarily
24:41 external like a lot of our go to market agents are. Um, and so in that case, it
24:46 made sense to build. I'd say for a lot of things, it doesn't make sense to
24:49 build, right? Like, um, if you guys listen to the podcast Kyle and Jason
24:53 did, I think we put it up like last week or something. Kyle, who's the CRO of
24:57 Owner, talks about how, you know, he's also kind of roughly followed the 90/10
25:01 rule: you know, he's bought a lot of third party agents, he's made them
25:05 work, and then he hired somebody who was, I think, a former
25:09 founder or something, right Jason, to build a proprietary in-house tool, and
25:13 that's like one extreme, right? But even that 10% that he's building
25:17 in-house, like he hired somebody who was a CEO, was an engineer, knew how
25:21 to code, like I think he was a CEO of an LLM company or something,
25:24 knew all this crazy stuff and could build a proprietary internal
25:29 agent. But again, for a lot of things, it doesn't make sense to build. Um, I think too, just to address some of
25:38 the questions on the chat of, like, what's been our criteria. Um, and we've
25:42 talked about it a little bit before, but when you're evaluating these tools for
25:46 the 90% you want to buy, um, I think the important thing is,
25:51 one: you know, again, I don't know why, but in the age of AI people sometimes
25:55 will throw things away because they're like, "Oh, there's this shiny
25:58 new object." I literally asked all of these AI tools that we now use and
26:04 deploy for help. I was like, one, I'm going to need help, like
26:08 I'm going to need an FDE, and two, let me talk to people who have used this. Like
26:12 I think I see too often folks are like, okay, it's an AI tool and so I'm not
26:16 going to ask for a customer reference. Like ask for a customer reference. I do
26:20 these all the time now. Like I try to make them as short as possible now cuz
26:24 um you know we do these webinars and stuff too, but I do these all the time
26:28 now. Like Marshall from Mangomint, Kyle from Owner, like we do this all the
26:31 time. Like Phipe from Persona, we do this all the time now. People ask us
26:34 constantly for, like, you know, a customer reference. So
26:38 ask them for a customer reference, and if you can, ask them for one in your
26:42 vertical, see what they say to you, right? Like if they push back, maybe
26:46 don't use that vendor. Yeah, most of these folks have
26:49 at least one customer that's slightly like yours. If they don't have one in
26:51 your vertical, maybe you can give them a pass on that. But at least talk to a
26:57 customer and then see how much they will help you, right? I think a lot of these
27:00 tools, to their credit, the third-party tools we do use now, have been
27:04 helping us along the way, right? Like, there's some of this we've
27:09 learned from just now deploying so many agents, but some of it was because they
27:13 put an FTE on our success team, right? Like Salesforce put an FDE on our
27:18 success team. Artisan is unique in that, you know, anytime I have an issue or
27:23 I have an idea, I just, you know, ping the CEO or the head of product. You know,
27:27 Multi-Agent Management Realities
27:28 Qualified, there's an FTE on our success team. Oh, you know, Replit, we have an
27:33 FDE on our success team. There's just so many cases here where if you ask them
27:37 for that, they should give you some level of that service, right? Like to
27:42 make it work cuz they should want your business and they should want you to be
27:44 successful. Now, it doesn't mean that you need to have an FTE like every week.
27:48 Like now I meet with them a lot less often than when getting started, right? But
27:51 you should ask them, at the very least, to have some FTE time at the very start.
27:59 >> I'll say one thing on tools. I know these are versions of things we've said since the
28:03 beginning. When you're talking to a vendor, >> if it doesn't feel right, don't buy it.
28:07 >> Yeah, >> it should feel right. It should feel
28:11 a lot of folks flame me a little bit when I say a lot of agents should almost
28:17 get you going for free, right? And a lot of the agents can't do that. There's
28:20 economic reasons. There's headcount limits. People can't really train you
28:23 and deploy you for free. >> But if you look at like um the 20VC that
28:29 I did with Harry and Rory when Marc Benioff came on, it was interesting when he
28:32 said he wished he could. He said he can't at Salesforce, but he wished he
28:36 had enough FDEs that everyone could be in production on Agentforce before they
28:41 had to pay. It's not practical, but the best ones take you as far down that
28:45 journey in the Age of AI as they can. They're proud of their products.
28:48 They'll show it to you. If something doesn't smell right, if it doesn't feel
28:51 right, if you don't think it's going to work, it won't work. Buy another one.
28:56 Even if the brand's less good, even if it's scrappier, even if whatever, if it
29:01 doesn't smell right, if your spidey sense says this agent isn't going to
29:03 work, don't buy it. [snorts] >> Yep. I agree. I think too, like um yeah,
29:09 that's the point you made on the free trial, that a lot of agents
29:13 cannot set you up for free. That's a really good point in the evaluation. So,
29:18 yeah, we you know, we threw down for these agents. >> It makes it hard. It makes it hard to
29:22 take some risk. It is interesting, I want to say, that when you look at the
29:28 prosumer AI tools that we highlight, all the Reeves and the Gammas,
29:32 they're lucky because you can get so much value for free. Even forget about
29:36 29 bucks a month or 99 bucks, actually the free products are great. Like try
29:39 those tools. The problem with AI GTM tools is, even if they want to do it, they can't do it,
29:45 right? So, you've got to take some risk, but um maybe not later in the year,
29:52 but you know, don't do it if it doesn't feel right. [snorts]
29:55 >> All right, that's our kind of like build versus buy rule. And then once you get
29:59 to this point of the process, something I wanted to address, which is
30:00 Managing Multi-Agent Systems
30:03 also the title of our talk today: what does that look like in reality
30:07 once you get into multiple agents, right? And I'm going to say something
30:12 today: it's not so simple. Don't let that scare you. Don't let that
30:15 frighten you off of doing more than one agent. Maybe you stick to one and it
30:18 works really well, and that's fine. Not everybody needs to be, I think,
30:22 on a [clears throat] multi-agent management journey. But just know that if
30:26 you're on that journey today, for ourselves and what I've heard
30:32 from some others, it's kind of all Band-Aided together. [laughter]
30:36 There's kind of a big reason folks like, you know,
30:39 Salesforce are having like a big renaissance, because a lot of these
30:43 third-party tools we use, for instance, push back to Salesforce,
30:46 or we push all the data back to Salesforce with like a Zap or, you know,
30:50 whatever, or some of them have a native integration that can push records and update
30:55 records back to Salesforce. And so a lot of the time right now it looks like, you
30:59 know, all of our third party tools, whether we're APIing into them or not
31:03 or using things like Zapier, then we have all of our internal data, our vibe-coded
31:07 apps, right? We're pushing all that back into things like, you know,
31:12 Clay, Zapier, back into things like Salesforce as our system of record,
31:17 just to keep all the records up to date somewhere central. But, you know, that's
31:21 Choosing the Right Tools for Your Agents
31:23 not native right now. That's not native at this moment. And so, it
31:28 takes a lot of webhooks. If you haven't heard this word, you'll probably learn
31:33 it fast. We have so many webhooks in our Zapier account, I can't even
31:36 count them, right? Like we have so many webhooks just firing all the time to
31:40 push things back, but I'm pushing them, again, into one central thing.
31:45 And for now, that's like Salesforce, cuz it can ingest all this data and
31:49 take all the context for our agents. Um, not to say you couldn't say,
31:53 okay, maybe I don't need that data everywhere all at once. Um, but I like
31:58 to have it. I like to, you know, build the context of the agents from
32:03 one agent to another. Um, and to sort of let it build on itself, we use a lot of
32:07 webhooks. You know, we use Zapier. I know n8n is having like a renaissance now
32:11 cuz it's kind of the same thing, but just built in the age of AI. Um, but
32:16 whatever one you use, right, you're going to you're going to see quickly.
32:17 Practical Example: Zapier and Salesforce Integration
32:19 And I've got a screenshot of it. You end up with a lot of like different hooks
32:23 and kind of hodgepodging things together. Um, but I think it's just for
32:27 now, right? I don't think that's a problem for always. I think it's just a
32:32 problem for now, you know, in the first half of 2026, to, you know, have it kind of
32:39 webhooked into things. You need to make sure you can control the flow of what
32:43 your agents are doing and where that data is ultimately pushing back to and
32:47 pulling from. I do think you should pick one source of truth, right, at the end
32:52 of the day to store some of this and then build further context for your
32:55 agents. You know, we picked Salesforce; you could pick HubSpot or something else.
32:59 I think also, get used to your agents talking to each other on their own. You
33:05 know, it happens. Our agents talk to one another. It's fine. Get used to also as
33:11 a human, like, talking to your agents. Um, it is kind of a weird thing at first
33:14 to get used to, and then you'll get used to it. And then also get used to, you know,
33:19 for now, copy-pasting context. Like we do a lot of context sharing between
33:21 our agents. Like, yeah, some of this pushes to Salesforce. But sometimes I'm
33:25 like, you know what? I don't want it to push through that flow. I'm just going
33:27 to copy paste some of this context from one agent and then put it in the other
33:30 agent in the way that it understands context. And so, again,
33:35 that's not necessarily the simplest or the cleanest path um of multi-agent
33:40 management. And so, I just wanted to be for real about that, that in today's
33:44 world, that's what our reality looks like. But that's also because,
33:49 you know, we use a lot of specialized tools. Like there are obviously I know
33:52 there's like all-in-one agent builders out there. Some of them
33:57 are coming to us, to our dismay. But for us, like, you know, I like to use the
34:01 specialized tools. I just still find that the output is a little bit better.
34:05 Like I like to use the best of everything in each agent versus like an
34:09 all-in-one tool um that can build multiple agents. For us it works
34:14 better. You might see success in using an all-in-one tool, you
34:18 know, that could build different agents across the board. But for us, since we
34:22 use like very specialized third party agents, this is like the reality we live
34:26 in. But you might not live in it: if you pick one system that can do multiple
34:30 agents, you might just have to manage one from there. If you're like, okay,
34:33 you know, I'll trade off maybe some of the quality for quality of life and
34:39 managing all the agents, then it [snorts] All right. So, what do I mean
34:47 by this? In reality, right, this is a screenshot
34:54 of one of my Zaps. I'll explain to you what's
34:58 happening here, because this is a good one. I also wanted to show people a go-to
35:02 market flow they could copy, um, maybe not necessarily at the same degree or
35:05 scale, but this is one you could feasibly copy slash iterate on for yourselves,
35:09 right, once you get to multiple agents. So you'll see it's catching a webhook.
35:15 I think this webhook is SaaStr Annual, if I remember which one I screenshotted. Um, I think this one is
35:22 Annual. It's catching a webhook because there's, you know, a lot of
35:26 forms on our website. Um, and we vibe coded the website. And so it's got a
35:31 webhook when you fill out the form. Um, and so anyways, it's catching this.
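(A quick aside for readers: "catching" a webhook just means running something that listens for HTTP POSTs. Here is a minimal sketch in Python using only the standard library; the port, URL path, and payload fields are made up for illustration, and this is not Zapier's actual implementation.)

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class FormHookHandler(BaseHTTPRequestHandler):
    """Listens the way a webhook endpoint does: it waits until some
    form submission is POSTed to its URL, then 'catches' the payload."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        # This is where a Zap would fan the submission out to
        # Sheets, a CRM, Slack, etc. Here we just print it.
        print("caught form submission from:", payload.get("email"))
        self.send_response(200)
        self.end_headers()

# To actually listen on port 8080:
#   HTTPServer(("", 8080), FormHookHandler).serve_forever()
```

Tools like Zapier host this listener for you; all you get is the URL to point your form at.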
35:35 Basically, a webhook is a listening tool. If you don't know what a webhook
35:39 is: it's listening to say, okay, um, in this case, when you submit a form, the
35:44 webhook is going to catch it anytime there's a submission, and then tell me what
35:47 to do with that hook, right? So, it's basically capturing that data. So, it's
35:52 catching the hook. It's porting that submission, one, to a Google Sheet cuz I'm
35:56 crazy and I just like backups of everything also in Google Sheets. Like
35:59 again, you'll literally see in this flow it's going to
36:02 Salesforce, but also, yeah, sometimes I just need a quick Google Sheet.
36:07 Sometimes it's just nice. So, it's pushing to Google Sheets. You'll see
36:11 it's pushing to Salesforce. So, um you could do this on contact or lead. Um,
36:15 you know, it also depends on your flow; we have Agentforce,
36:20 and so, um, ours is triggered off contacts. You can trigger
36:23 yours off leads. Ours is triggered off contacts and so it's creating, you know,
36:28 a contact in Salesforce. It's adding a contact to a campaign. Now, in number
36:32 four, I circled it because I said, you know, we can pick when it adds a contact
36:36 to campaign, if we want to send it to Agentforce already in this Zap, right?
36:39 cuz I have certain campaign triggers that say, "Okay, when they're added to
36:43 this campaign, trigger the agent to turn on." So again, you don't necessarily
36:46 need to do that if you're not ready for that yet, but it's something you could
36:50 do here feasibly, easily um and do it a little bit more automated, right?
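(Purely as an illustration of the Zap steps just described, catch the hook, back it up to a sheet, create the contact, add it to a campaign, here is what that logic looks like as plain code. Every name here is hypothetical, a stand-in, not Zapier's or Salesforce's real API.)

```python
import json

def handle_form_submission(raw_body, sheet_rows, crm_contacts, campaigns):
    """Hypothetical stand-in for the Zap described above.

    sheet_rows:   list standing in for the Google Sheets backup
    crm_contacts: list standing in for Salesforce contacts
    campaigns:    dict of campaign name -> list of contact emails
    """
    payload = json.loads(raw_body)            # 1. catch the webhook payload
    sheet_rows.append(payload)                # 2. back up the raw submission
    contact = {                               # 3. create the contact record
        "email": payload["email"],
        "name": payload.get("name", ""),
    }
    crm_contacts.append(contact)
    campaign = payload.get("campaign", "annual")
    campaigns.setdefault(campaign, []).append(contact["email"])  # 4. add to campaign
    return contact
```

A trigger like the circled step four, where adding a contact to a certain campaign turns an agent on, would just be one more branch at the end, along the lines of `if campaign == "annual": trigger_agent(contact)` (again, a hypothetical name).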
36:55 Then, you know, it's going to um find those
36:59 records of, you know, the company. Um, this is a little
37:03 misleading because it sounds simple, but it's finding the company records, right?
37:07 So since this is a contact-level record that it's created, um, and triggering to
37:11 Agentforce potentially, now it's going to find records of, okay, basically I'm
37:16 asking Salesforce to see, what is this company on the account level, because we
37:21 use account-level contact uh records. What has this company done with us? And
37:24 so I want it to find those records of what that company has done with us, and
37:27 then, you know, I want it to get the record attachments. If you use Clay, you
37:33 can use it here in a very kind of fun way to say, okay, if I already have a
37:38 table in Clay, you can have it summarized for you and then also
37:42 look at LinkedIn and say, okay, what else is this person actually also doing
37:46 on LinkedIn? What are they doing? What are they posting on social media, for
37:49 example? So again, you can get more context. You could skip this step if
37:56 you're not into using a Clay table, but that's a fun way you could do it
37:59 there. And then you can send a Slack channel message to send you all this
38:03 context of, like, okay, here's the, you know, contact that I just
38:09 added to the campaign. Here's the account information about it. Here's
38:12 Deep Dive: AI SDR Tips and Tricks
38:14 the, you know, Clay context about it. And then I'll send you a Slack about it.
38:18 And then if you really want to, you could do things like make a Gamma. Like
38:22 if you wanted to make either a landing page or a presentation for this person
38:26 to send in their email about, you know, let's say how to use Gamma at SaaStr or
38:31 whatever, like how to use whatever your company is for SaaStr. You could do a
38:36 super complex flow like that. Have it make you a draft presentation or landing
38:41 page to send to you. And then, you know, in Gmail, you could create a draft
38:44 ultimately to send to this person, if you want to do it that way. Again, this is
38:47 just a sample go-to-market flow. You can see I didn't fully set up my Clay table because
38:53 I'm just speeding through this. But again, this is a good sample go to
38:57 market flow. You'll see it's like, you know, it's got agents kind of layered in
39:01 it. There's like an Agentforce layer in it. There's a, you know, if you consider
39:04 a Clay agent, there's a Clay agent in there. You know, this one pushes to
39:09 Gmail, but if you have an AI SDR email platform, you might want it to
39:12 push to that platform. But, you know, all that I think is just
39:18 important to see as an example. >> in this multi-agent management sample
39:25 flow. Right? Again, this is just a sample flow of how you can feasibly kind
39:31 of manage agents, which right now for us is somewhat messy, but it looks a lot
39:35 like these Zapier flows. It's a lot of Zapier to Salesforce to other things to
39:42 APIs to whatever. And so yours may or may not look like this. I think a lot of
39:45 times folks will be like, "Oh, you guys have 20 agents. Like, who are you using
39:49 as your MCP?" I'm like, "We don't have one." Like, we don't have a true one. I
39:53 don't consider this Zapier or Salesforce thing a real MCP. I consider it MCP light,
40:02 but if you truly look up what an MCP is, it's not a
40:07 true MCP. Like, yes, the context is sharing back and forth and you can kind
40:11 of get there on Zapier and Salesforce, but again, I call it light MCP in air
40:18 quotes cuz it's not really an MCP. And so many people have been asking me that
40:21 lately, cuz they've seen, you know, all of our content on our agents, and are like, yeah,
40:24 you know, what do you recommend I use for my MCP? I'm like, I'm not using one,
40:31 truly. Like, this is my MCP. It's a lot of human work. So [laughter]
40:37 again this may not be your use case, but this is how we've done it. Okay. Uh I
40:42 just want to deep dive into two quick things because I feel like there are um
40:47 a few related questions to it. So I have a few deep dive slides on the AI SDR and
40:52 then a few deep dives on our AI VP of Marketing that I'll just quickly touch on, and then if
40:55 you guys like this content, I don't know, I could do more
41:00 questions at another time um on another Wednesday that's not AI day. But yeah,
41:05 on a quick deep dive, I think um things to keep in mind, because a lot
41:09 of you in the chat seem to be rolling out like your first AI SDR. Now, I think
41:15 a few tips and tricks just agnostic of any tool that you use. I feel like this
41:20 is good um hopefully good advice across the board regardless of what tool you're
41:24 using. Which is, one, to treat each outbound segment dynamically. And what I
41:28 mean by that is, even across the, you know, multiple agents we have for AI go
41:33 to market, I don't do what I see people do: one campaign for like 10,000 leads.
41:40 I'm like, no, I max my campaign at like 100 to 500. Like I want each campaign, each
41:47 sub-agent to be highly customized, highly trained to the exact segment that it's
41:52 going after, not like a broad "hey, have you heard about SaaStr?" Like, no. I want
41:56 to say, okay, these are my outbound segments. I put a chart here on the right
42:01 that I made um for our outbound AI SDR funnel, hopefully it's helpful, but I
42:05 treat each of these dynamically and I train each sub-agent dynamically on each
42:10 of these things so that the output, to Jason's earlier point, is pretty good,
42:14 right? Okay, maybe it's not great, but at least it's pretty good, because
42:18 everything is tailored: the audience is hyper-segmented, the messaging is
42:22 hyper-segmented, the training is hyper-segmented. Hyper-segmentation in the age
42:27 of AI with these agents is your friend. Don't spray and pray, please.
42:31 Like don't do that with your agents. I see a lot of people do that. That's
42:36 how you get the bad emails off the AI SDRs. You know, another way to
42:41 think about it too is to not think about it in the human ways of
42:45 segmentation, right? Like a lot of times um classic outbound would be, okay, I'm
42:48 going to do it on the geo of where they're based. I'm going to do it on
42:50 their title. I'm gonna do it on their role. You can see on my chart, none of
42:55 that exists here. Like I'm not doing any of that super high-level, almost
43:00 artificial segmenting. We do it hyper-segmented. And the reason we do this is
43:05 to as I bolded it here, give your agent context. Right? If you're already used
43:10 to using chat and claude, what you're doing with those agents every day is
43:14 talking to it, giving it context, telling it about your business. That's
43:18 the same thing you have to do for these AI go-to market SDR agents. You have to
43:23 give your agent context and the more context you give it, the better the
43:29 result will be. And so that's why I hyper segment everything list,
43:34 messaging, targeting, etc. All hyper segmented to the AI STRs. And that's
43:40 across all of our AI SDR agents, right? You have to give your agent context for
43:45 it to understand who you are trying to reach out to. What are their specific
43:49 pain points that your product and your tool can solve? And then I'm going to
43:53 use, you know, my classic AI move of: I can scrape the internet, see what their
43:57 company is doing, and relate it back to them. And so you'll see in my outbound
44:03 AI SDR funnel, none of this is cold leads, and none of it is geo or
44:08 title or location. And I think too, this is a good list. I don't know how long this list is, 12 things,
44:15 but for most of you, start here. Start here with your AI SDRs. Too many
44:21 folks I see now are doing AI SDRs like, I'm just going to let it loose on cold
44:24 outbound because that's what our human SDRs don't want to do. I understand your
44:27 human SDRs don't want to do cold outbound to people who don't know you,
44:31 but neither does your AI agent, because your AI agent does not have context for
44:35 why you should be reaching out to this person. So, the same rules apply here in
44:42 outbound AI SDRs: start with the hot people, the people on
44:45 your website. A lot of these AI agent tools can de-anonymize some of your
44:49 website traffic to email them, people who have inbounded to you. If you have
44:53 abandoned carts or trials, or you have event leads, start with all the hot
44:57 people. Do the people who were a customer, maybe they changed
45:01 jobs. Do current customers. Like, we get this all the time. I email
45:04 people who bought a ticket to come to SaaStr Annual in May, and
45:08 I email sponsors that are current customers to be like, hey, we
45:14 added a bunch of new stuff. I think too many folks kind of skip using AI for
45:19 expansion, but it's a great way to do it. You know, if you have recent marketing
45:22 leads because you're doing something like a webinar like this, or you've
45:25 gotten ebooks or gated content, or you spent some money on some sponsored media
45:29 and you got some leads, put those people onto the agent. Leads we never followed
45:33 up with that we famously gave to Agentforce. Again, the list goes on. You
45:36 could see what I mean, hopefully, here. There are so many hyper-segments you
45:41 can give your agent before you give it a quote-unquote cold lead that knows
45:44 nothing about you, so you should start here. And a lot of the reason why you
45:49 should start here is not only will it give your agent context, it will give
45:52 your human team context on what works and what doesn't. So that by the time
45:56 maybe you exhaust this list (I still haven't exhausted this list after 8
45:59 months), or maybe you start to dwindle down this list because you don't have as
46:04 many contacts, then you can start to do the AI truly cold
46:09 outbound to folks who maybe don't know you. But at
46:12 that point, you're using what worked. Again, this all goes back to what works,
46:16 right? At that point, you know how to train your AI agent. You kind of
46:20 know what's worked for these audiences, and then you can make a very
46:23 informed guess on what would work for a cold audience. All right, hopefully that's helpful. I
46:30 think the other quick thing, just across the board, and then I'll go into the
46:36 AI VPM that we built and try and do some quick questions, is, you know, AI is
46:41 great because it can ingest everything, right? We have it
46:46 ingest the best of everything: your best case studies, your best everything, right?
46:53 But also tell it what you can't do, and I think this is a super important nuance
46:56 that I've only learned after 8 months. I used to just be like, okay, here's
47:01 the best of everything, it's super good, stay in these boundaries.
47:07 And then over time, because the agents are so self-gratifying, it's
47:09 trying to beat itself, right? It's like, "Hey, Amelia, I did pretty good." And
47:13 so now I'm going to start to maybe either make stuff up or try and beat
47:19 myself with my opens, clicks, meetings, and I'm going to start to say things
47:24 that maybe you didn't put into the context of the agent. And so I quickly
47:27 learned, a couple months in actually, that once you start to do this at scale, it's
47:35 maybe just as important to tell your AI agents what you can't do as what you
47:41 can do. Like, I have now told it, you know, okay, we don't do that, or we don't
47:45 do this, or we don't do that. You know, we don't offer people a
47:49 speaking slot. Yeah, we have speakers at SaaStr, but a lot of people
47:54 apply to speak. Send them to the content committee submission;
47:59 do that instead. So I think that's just an important nuance
48:02 I've learned over time. So hopefully that's helpful for you guys to know now.
48:07 Hopefully earlier in your journey. I kind of learned it the hard way: it
48:11 sent some emails it shouldn't have, of things that we didn't do, and I
48:17 realized it was because I didn't tell it that we couldn't do those things, right?
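The "tell it what you can't do" lesson above can be made concrete with explicit negative constraints plus a cheap check on outgoing drafts. A hypothetical sketch: the constraint list, `build_instructions`, and `flag_violations` are illustrative names, not any real agent platform's API.

```python
# Hypothetical guardrail sketch: keep an explicit list of things the business
# does NOT offer, bake it into the agent's instructions, and flag any draft
# email that mentions one of them anyway.
CANNOT_DO = [
    "speaking slot",   # e.g. speakers go through a content committee instead
    "free sponsorship",
    "custom roadmap commitments",
]


def build_instructions(base_prompt: str) -> str:
    """Append explicit negative constraints so the agent doesn't improvise offers."""
    rules = "\n".join(f"- Never offer or promise: {item}" for item in CANNOT_DO)
    return f"{base_prompt}\n\nHard constraints:\n{rules}"


def flag_violations(draft: str) -> list[str]:
    """Cheap safety net: list any disallowed offers that slipped into a draft."""
    lowered = draft.lower()
    return [item for item in CANNOT_DO if item in lowered]
```

Flagged drafts would go to a human for review before anything is sent.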
48:20 It was just ambitious, in the way that maybe a human SDR would be like,
48:24 "Oh, I don't know, I think we could do that," or "Oh, I think that's on the roadmap,"
48:29 classic, right? And so the AI agent did a little bit of that. And so I think
48:33 it's important now to say, okay, here's what we can do, and here's what we can't.
48:40 [snorts] Okay. This is a little context, but I want to go through it. I'm going to upload
48:47 these slides for everyone, so just ask. So don't sweat it, I'll also
48:51 send it to you. We still reply to everything. Maybe just the last
48:57 tidbit on AI SDR agents is, you know, if you have bad foundations, and what I
49:02 mean by that is bad context, that's where you'll see bad emails, right? Bad context
49:07 equals bad emails. Honestly, this bad email I put on here, I actually think a
49:10 human wrote it, to be honest. It was written in a way that I actually don't think a
49:14 Building a Custom AI VPM
4:08 Performance Metrics and Results
4:08 like advice and support, and we spent a lot of time training it. We've talked
4:11 about it on other pieces of content here, but that was our first foray into
4:16 an agent. And I think support and sales, I see now, are the two most common use
4:20 cases of where people deploy their first agent. So, a super common
4:26 thing to do there. And then, you know, we added in an outbound AI SDR, then we
4:30 added in multiple of those. Now that's not necessarily, I think, a path that everyone
4:35 needs to take. You could probably deploy one outbound AI SDR agent and do it well.
4:43 And then secondly, I think, you know, we also looked across our sales funnel at
4:49 where we want to be in terms of what else we want to deploy in go-to-market.
4:51 And so we quickly added, you know, multiple outbound AI SDRs, an
4:56 inbound AI SDR agent, and then multiple agents across go-to-market, and then a
5:01 few custom vibe-coded apps, some of which we're using now internally, but
5:04 mostly they're external-facing, like, you know, the pitch track, valuation, and
5:09 things like that. >> It's a good summary. A lot of folks have
5:16 followed the journey. But we did push the limit here, as some folks
5:20 know, after SaaStr AI Annual last year in May. Basically, anyone that left our tiny team,
5:26 we replaced them with an agent. So we've been on this journey.
5:31 We've deployed, you can see, our whole list. We've deployed a bunch. We
5:35 are also one of the leaders on Agentforce, and Amelia will touch on that.
5:37 We'll keep chatting about that because I think it's just helpful for you guys to
5:42 see these apps in production. And then we'll get into it. And we've
5:47 also, we started off vibe coding for fun, but as Amelia will get to, we found we
5:51 really recommend you buy something, don't build: the 90/10 rule Amelia will
5:55 have here. But the latest thing we've added, which we've talked about, is we
6:00 built our own VP of Marketing. >> And we'll talk about why in a minute.
6:04 >> Yeah. So that's kind of been our journey. Again, it's not
6:08 necessarily that everybody needs to be at, you know, this number of agents. We
6:11 are for a lot of reasons. We try different agents, right? Like, part of
6:14 the SaaStr community and ecosystem is we are trying different agents. We have
6:18 different partners, and so we also have an underlying need to just try a bunch
6:22 of different things. I also am just wanting to try different agents and see
6:27 what works and what doesn't. And so there's that inherent piece, but that's
6:31 not necessarily what you need to do for your business. It may not make sense to
6:34 be at the same level and number of agents we are. But okay, so eight months
6:38 later, you know, the kind of highlight result is, you know, does all
6:43 this work? Now that we're, you know, maybe a year,
6:46 eight months into this journey, I'm starting to see a little bit more
6:50 skepticism, honestly, weirdly, on LinkedIn. I don't know what you're
6:54 seeing, Jason, but I see some people becoming a little bit
6:58 disenchanted with AI agents. And I see a little skepticism. So, does
7:01 this work? I think for us, you know, we have now, 8 months in, $4.8 million in
7:06 additional pipeline sourced via agents, and I'll talk
7:09 about why it's additional, and then about half of that, so $2.4 million, is
7:14 closed-won revenue that was, you know, first-touch sourced from an agent across
7:20 go-to-market. So a lot of good things there as a proof point.
7:27 Additionally, yeah, we've seen some underlying things also improve via the
7:32 agents. So, our deal volume has more than doubled, right? I count a lot of
7:36 that towards, and credit a lot of that towards, the agents working, you know,
7:41 24/7, 365. They can always answer a question. They can always book a
7:43 meeting. They can always reach back out to you. Now, sometimes, you know,
7:47 the humans here, like me, David, Jason, have to take the meeting. So,
7:51 sometimes they're limited by our human capacities, but I credit that to the
7:56 agents working, you know, year-round, around the clock. Now, our win rate has
8:01 nearly doubled. I think that's just in the nature of the agents, you know,
8:05 especially when it comes to folks that are inbounding, and also how it's doing
8:08 outbound, just having a lot more context, right? It's a lot better
8:12 context. It's a lot better outreach in some cases. But then, essentially, for us,
8:17 across me and David specifically in sales, it helps us with our
8:21 conversations because we already know what this person has said to the agent.
8:25 We can see the exact conversation they're having, and, on
8:28 the website, we can literally see what else their company has been doing with our
8:32 agents, and we use that in the meeting, right? So it saves a lot of time. It's also a
8:35 lot more qualified when we get on that call, and I think a lot of that has to
8:39 do with some of the nurturing there. And I think, more importantly too, in those
8:44 stats, it did not cannibalize our other inbound revenue sources; those have
8:49 just been augmented by our agents. And another thing I like to remind folks too
8:54 is, it's not that we dropped a bunch of stuff either when we deployed
8:58 these agents, right? I think a lot of folks now will be like, well, okay,
9:03 if I just get an outbound AI SDR, then maybe I don't need any human SDR. And
9:07 maybe that's true, right? Between David and I, we do some
9:10 outbound ourselves, so we don't really have a human SDR, per se,
9:15 but I think there's a lot of pieces in which you might still need
9:20 both, cuz we do personally respond as humans to each message that
9:26 our AI agents produce, and so there's a lot of, you know, time needed there. I
9:31 don't necessarily like the AI autopilot responses; I still think it's better to
9:34 come from a person who actually knows the business. And I think, too,
9:40 a lot of this has been augmented by our agents, you know,
9:45 in helping us book more meetings, helping us understand the leads a little
9:49 bit. Again, we did not drop the other things we're doing, right? We still send
9:53 marketing emails. We still do outbound. We still send gifts to people. We
9:56 still invite people to come to the house. All the things we used to do
9:59 before, with the agents, yeah, we can do them a little better, but we're
10:03 still doing them. And so I think that might be surprising to some folks, but
10:07 just know, I don't think it will cannibalize anything if you do it right.
10:10 But I also think it can definitely augment versus, you know, you don't
10:15 necessarily have to replace. We've done that and it's
10:18 worked, but you may see that doing a mix works too. Okay. So cool. But you know, here's the
10:30 honest truth that you may not see on LinkedIn or X,
10:38 and that is that we maintain these apps every day. Literally, even this
10:43 morning before getting on AI Day, you know, I'm checking our agents.
10:48 And I think the important thing here is the agents and the humans have to
10:54 rapidly evolve and change constantly. It's such a mindshare killer for
10:59 myself, for Jason. We're in these agents, you know, I think 15 to 20 hours a
11:06 week each, that's each, not like between the two of us, that's each of us, constantly
11:11 iterating with our agents, constantly seeing, you know, what are
11:14 they outputting, checking the responses, making sure it doesn't hallucinate,
11:18 making sure, you know, it's talking to folks the way we want it to
11:22 talk to people, making sure it's adding value, making sure it's not degrading,
11:25 right? Sometimes you see these agents degrade over time. And so I
11:30 think the important thing here is, it is kind of a real killer. It does
11:33 take a lot of time. I don't think you can replace the time of managing; you
11:37 know, we've just seen the time shift. The time we used to spend managing
11:41 slightly more people on our team, we now spend that same amount of time, if not
11:44 more, managing the agents, but it's just a lot different, right? There's
11:49 no people drama, really. But the agents just work at a much
11:54 higher capacity and higher scale than a human being, so it's hard to eventually
11:59 keep up with them. And I put this here in bold: I've been trying
12:03 for the last few months to keep up with my agents, and then I
12:06 realized it was futile because I can't do it. But I try and keep up as
12:10 best as I can. And what I truly mean by that is, you know, anytime we get a
12:14 response, we have a system that'll Slack us from any of our agents. So
12:19 whenever there's an interaction, or an agent is having a conversation with somebody
12:23 and we want to reply, and, you know, again, we like to reply ourselves,
12:28 we do try and respond to those people literally instantaneously, if not
12:32 in real time. Sometimes we are asleep, and so we respond to them
12:36 first thing in the morning. But I've realized I can't keep up with my agents;
12:41 they're smarter than me. >> I'll tell you just one nuanced
12:45 learning, actually, from this week. So, I was meeting last night with the CEO of
12:51 a next-generation AI go-to-market agent company (we're already in the next generation)
12:55 that's already got millions of revenue and is publicly launching in a few
12:58 weeks, but they already have millions of revenue. I've known them
13:02 for a long time, but I asked what the secret sauce was, and the secret sauce
13:08 was they do everything: they do the onboarding, they do the tagging, they get the first campaigns
13:15 running, they do everything, and they do it almost to such a fault that
13:18 [clears throat] some of the customers think it's too easy. They don't even
13:21 realize the energy that's going into it if they haven't deployed an agent yet.
13:28 And the learning from that is just, if you haven't deployed many agents, or
13:32 any for real, you got to have an honest conversation, not with
13:35 someone in sales that doesn't know how the product works or use it themselves,
13:39 but with a forward-deployed engineer, with a leader, and find out what it is going to
13:44 take to be successful upfront, the first 14 and 30 days, and every day
13:48 thereafter, and then you got to do it or it will fail. >> Yep.
13:52 >> And meet with the best of them. And, you know, if you don't know your
13:57 FTE, if you don't know the answers to these questions, and you're spending any
14:01 material amount of money on an agent, you're wasting your energy. And so it
14:04 will be interesting to see where it goes this year. A lot of the agents we use
14:09 are pushing down-market to be more self-service. So far, that doesn't work.
14:13 So far, I will say, for the most part, agents that require deep training cannot
14:18 be self-trained. It will come. Agents are getting so much better. That's
14:22 frontier one. But wait and see. Be skeptical. If you buy a cheap tool that says it's
14:27 self-trained, make sure it works, and you know the time it takes. If you buy a more complicated
14:32 tool like we're talking about, just talk with someone senior enough on
14:36 deployment, not, again, someone trying to sell you something that doesn't know,
14:40 and be honest about what it's going to take. Otherwise, it's like going to
14:43 the doctor and getting a prescription for medicine and never taking it. It's
14:46 not going to work. It's literally like that for an agent, right? But this
14:50 was the first one I saw that could do the type of stuff Amelia and I are
14:54 talking about, but without you doing any work. But it's because they have a huge
14:57 human team. They call them forward-deployed AEs in the beginning, and then
15:01 other folks take it over. But that's an extreme case. I haven't seen any other
15:04 app like this that can just do it without consistent training and work
15:09 every single day. Every single day. [snorts] >> Yeah, we, you know, we do spend a lot of
15:13 time per week actively managing all of our agents. I think, to Jason's point, be
15:20 prepared, right? Again, I see too many folks who see the
15:23 really good stats (even our stats are, you know,
15:26 fairly good) and they get a little mesmerized by them, thinking, you know,
15:31 okay, it's AI, I can just prompt it and it'll do it fairly quickly. I'll
15:36 say too, most of these agents have some sort of prompting in them, but
15:41 they're not necessarily all built via a prompt, right? Don't think about it
15:44 the same way you think about putting a prompt into OpenAI or Claude. It's not
15:49 going to function the same way. There is some prompting in
15:53 each of these third-party tools, but at the end of the day, you're going to have
15:56 to figure out what works, put that into, you know, their prompt builder in
16:01 whatever format they have, and refine it from there. And then there's
16:03 usually additional steps other than a prompt. So I think some of the tools
16:08 we've seen are kind of getting to that point. Maybe they'll get there by
16:11 the time we're at SaaStr Annual, or maybe right after, in the second half of
16:14 the year. But most of them are not just a prompt builder. So I think that's
16:18 another thing to just bear in mind: if you're used to
16:23 the easy path that Claude and ChatGPT and OpenAI have, you know,
16:28 put out there, they're not all like that. They're not all that
16:31 easy. Now, I will say you can use things like ChatGPT and Claude to make your
16:35 prompts better. I do that all the time. I'll take whatever context I'm
16:38 putting into our agents, I'll run it through Claude, I'll run it through
16:42 ChatGPT, and see what it suggests to make it better. Sometimes I'll take or leave the
16:45 suggestion. Sometimes, you know, the AI doesn't know your business
16:50 the same way you do. And so, I think that's just another thing to be
16:53 prepared for. So, how did we get to this? I think this will address some of
16:57 the questions as well. Yeah. How did we get to these results of, you know, $4.8
17:00 million in additional pipeline, and $2.4 million of that in closed-won revenue, so
17:03 about half that? And then also, you know, we've now crossed the 60K
17:07 mark in how many emails and interactions our AI agents have had just
17:12 in the sales funnel, right? That's not even counting the close to a
17:16 million we've had in our proprietary interactions with our vibe-coded apps.
17:20 But that's a lot right there. >> You're going to explain how
17:24 this looks, and given how tiny we are, it's pretty impressive numbers. But if I
17:28 had to summarize all of this, and then challenge me if I'm wrong, cuz you're
17:31 doing the real work, right? I think some of the key here is not that
17:36 all these emails were the best emails that have ever been sent in the history
17:39 of mankind. I know you think they're great. I actually just think they're
17:42 okay. I don't think they're bad. I think they're better than most of the outbound
17:46 emails I'm going to get during this day, during AI Day. The ones we send are
17:49 better, but they're not the best I've ever seen; they're not something you can spend
17:53 an hour crafting. I think the number one key, and that's why the 60,000 number is
17:59 cool: we are touching folks, lapsed folks, folks we forgot to talk to, folks we don't talk
18:06 to enough, more often. We are connecting with more people more often, not spam,
18:11 right? But we couldn't do 60,000 high-quality emails manually, or even with
18:16 old-school outreach tools. We just couldn't do it, right? So, I think the key
18:22 to this, tell me if I'm wrong and then we'll be quiet, I think it's just
18:26 more high-quality, pretty good interactions. >> Yep. >> That's the thing. We're getting scale.
18:32 And that's why, when I think about everything we've learned, and
18:34 you've done most of the work, Amelia, if I had to advise people that are earlier
18:38 on their journey: find something in your go-to-market motion that just isn't
18:42 getting done or is getting done at a very mediocre level.
18:45 >> Yep. >> Then put an agent on it. Don't try to replace
18:50 what's working well. Do that as your 10th agent or your 20th. Literally,
18:55 we're such a tiny team. We just weren't reaching out to enough people in our
18:59 base, in our activated base. And so that's the low-hanging fruit for us. We
19:03 just could not do it. Sure, you could put something in a dated Outreach or Salesloft
19:08 cadence, but that doesn't work, right? But we never would have done this otherwise.
19:13 So find that low-hanging fruit, the stuff you're just not
19:16 getting to. The customers that are too small, the customers that take too long
19:20 to respond and your team doesn't want to do. The customers that have,
19:25 you know, lower scores, right? They
19:29 still have intent, but no one wants to call them back. Do those ones,
19:33 whatever your low-hanging fruit is, because then even if you get some yield,
19:38 it's magical. >> Yep. I agree. So I think again a little
19:44 bit of a misconception here, related to some of the chatter: the
19:50 formula for us is to copy your best human. As you're deploying, maybe
19:53 you've already deployed one agent, maybe you're deploying your next agent, to
19:57 Jason's point, right, do something that you could get a lot of scale out
20:03 of by adding an agent and do pretty good. There's just some things the agents
20:08 can't slash shouldn't do. Like, obviously we love agents. I love our agents. I
20:11 use a lot of them. You know, I just added the AI VP of Marketing; we'll
20:15 show you guys. But there are some things where I'm like, the agent would just suck at
20:18 that, so I'm just not going to do it. There are some things I
20:22 still need humans to do. Like a lot of the production stuff we're doing for SaaStr
20:25 Annual, I'm like, I still need a human to do that. Dude, the agent is not there
20:30 yet. But what we also mean by copy your human is, if you're going to add AI
20:35 agents, add them to help you scale, right? It's on scaling: more emails,
20:40 more meetings, more clicks, more volume. Figure out what works first. I see too
20:45 many people who, you know, okay, they want to automatically give an AI SDR to
20:51 their SDRs. And I'm like, okay, well, are those SDRs new? Did they just
20:54 join? I don't think it's a good idea to give it to every single SDR. I think
20:58 there's a lot of reasons that would get you into trouble
21:03 fairly quickly in terms of workflows. But I also think, if you don't know
21:08 what works first... I don't get this mindset of, oh, it didn't work, but
21:12 I'm just going to add AI and it'll magically work now. Like, no. If it
21:16 didn't work or wasn't working before AI, it's not going to magically work now.
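The "figure out what works first" step above can be as simple as ranking your historical email templates by reply rate before any of them become agent training context. A hypothetical sketch: the data shape, threshold, and `best_templates` name are illustrative, not from any tool discussed here.

```python
# Hypothetical sketch: rank historical email templates by reply rate and keep
# only the top performers as "best of everything" context for the agent.
def best_templates(history, min_sends=50, top_n=5):
    """Return the top_n templates by reply rate, ignoring ones with too few
    sends to be meaningful."""
    qualified = [t for t in history if t["sends"] >= min_sends]
    ranked = sorted(
        qualified,
        key=lambda t: t["replies"] / t["sends"],  # reply rate
        reverse=True,
    )
    return ranked[:top_n]
```

The winners, not the whole archive, are what you'd feed the agent as its starting context.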
21:19 And somebody asked me this question yesterday when I was doing a Salesforce
21:23 webinar, like, okay, what if I'm a super early-stage startup and I don't know
21:27 what works? And I was like, well, do you have any customers? They're like, yeah,
21:29 we have, you know, 10 paying customers. I'm like, "Well, go ask your
21:33 customers why they bought you." Everybody has at least a few customers,
21:36 or maybe, if you're super early-stage and you've got a few folks on a trial, just
21:41 go ask them. Figure out what worked. Figure out what got them into
21:44 your product and is getting them hands-on with the product. Figure out what works first,
21:49 right? We had so much data that we ran through, before we put it into any of our
21:53 agents, on what was working, right? The best email copy
21:58 for doing outbound, the best responses of how we should follow up with inbounds,
22:04 you know, the best context and verbiage about SaaStr, about SaaStr's
22:09 events, about sponsoring SaaStr. We went through all this data, we went through
22:12 all this context, we flagged everything that was the best of everything, before
22:19 we put it into any agents, any AI, etc. So, you know, train the agent on
22:24 what works best. I see too many people now falling into the trap of, they
22:28 want to add AI into something new. And sometimes you can, and it will work to
22:32 some degree, but I think if you do it and you train it on the best of
22:35 everything, it will work that much better, right? It will get you to pretty
22:38 good, to Jason's point. It'll get you to pretty good emails. They still
22:42 may not be the best on planet Earth, but I think it'll still
22:46 put you over that bar of pretty good versus crappy AI emails that we've all
22:50 seen, or even crappy human emails that we've all seen. So yeah, that's why I'm convinced
22:55 on it. But yeah, I think, you know, you have to train it on the best of
22:58 everything. And if you don't know what that is yet, I would take that time,
23:02 take a week, figure that out before, you know, you deploy your first or next
23:07 agent, and then see where that gets you. I feel like you'll have a
23:12 better output, because we are constantly iterating our agents now to make sure
23:16 they have the best of everything and that they know everything that we
23:22 know, as we know it, right? So as we get, you know, speakers for SaaStr, or
23:26 we have new things that we're doing, or now we've got lounges and stuff, or
23:29 new things in our sales process, I'm constantly making sure the
23:33 agent knows all this so that it can talk about it. Okay, so I want to address some of the
23:40 questions in the chat. You know, some folks are asking about
23:44 evaluation tools, like, what's our process? And then this is the 90/10 rule
23:50 that Jason came up with, but I really do agree with it, and I think it's a good
23:56 one, which is, you know, buy 90% of your AI stack (and I'll talk about the evaluation
24:00 process we've done in a second) and only build the 10% where I
24:05 think there's, you know, no vendor that can do this well, and
24:10 it's either a P1 priority or, as you'll see in our AI VPM, you know,
24:14 I built that agent cuz it wasn't a commodity. It was something where,
24:18 even with all of our agents now, I was like, I still have so much data from
24:24 SaaStr internally that I want to act on, and I want to deploy this agent in a way
24:27 that maybe I don't need it to run everything automatically. So, it's a
24:32 very specific use case, but that was where I kind of built that, and that's
24:35 where that kind of fostered from, right? It's like, okay, I had all this
24:38 data. I wanted to do something that was more internal-facing, not necessarily
24:41 external like a lot of our go-to-market agents are. And so, in that case, it
24:46 made sense to build. I'd say for a lot of things, it doesn't make sense to
24:49 build, right? Like, if you guys listen to the podcast Kyle and Jason
24:53 did, I think we put it up last week or something. Kyle, who's the CRO of
24:57 Owner, talks about how, you know, he's also kind of roughly followed the 90/10
25:01 rule: he's bought a lot of third-party agents, he's made them
25:05 work, and then he hired somebody who was, I think, a former
25:09 founder or something, right, Jason, to build a proprietary in-house tool, and
25:13 that's one extreme, right? But even for that 10% that he's building
25:17 in-house, he hired somebody who was a CEO, was an engineer, knew how
25:21 to code. I think he was the CEO of an LLM company or something,
25:24 knew all this crazy stuff, and could build a proprietary internal
25:29 agent. But again, for a lot of things it doesn't. Um, I think too, just to address some of
25:38 the questions on the chat of like what what's been our criteria. Um, and we've
25:42 talked about it a little bit before, but when you're evaluating these tools for
25:46 the 90% you want to buy. Um, I think the important thing is to
25:51 one, you know, again, I don't know why I think in the age of AI people sometimes
25:55 will will throw things away because they're like, "Oh, there's this shiny
25:58 new object." I literally asked all of these AI tools that we now use and
26:04 deploy for help. I was like, one, I'm going to need help, like
26:08 I'm going to need an FDE; and two, let me talk to people who have used this. Like
26:12 I think I see too often folks are like, okay, it's an AI tool and so I'm not
26:16 going to ask for a customer reference. Like ask for a customer reference. I do
26:20 these all the time now. Like I try to make them as short as possible now cuz
26:24 um you know we do these webinars and stuff too, but I do these all the time
26:28 now. Like Marshall from Mangomint, Kyle from Owner, like we do this all the
26:31 time. Like Phipe from Persona, we do this all the time now. People ask us
26:34 constantly for, you know, a customer reference. So
26:38 ask them for a customer reference, and if you can, ask them for one in your
26:42 vertical, see what they say to you, right? Like if they push back, maybe
26:46 don't use that vendor. >> Yeah, most of these folks have
26:49 at least one customer that's slightly like yours. If they don't have one in
26:51 your vertical, maybe you can give them a pass on that. But at least talk to a
26:57 customer and then see how much they will help you, right? I think a lot of these
27:00 tools to their credit for the third party tools we do use now have been
27:04 helping us along the way, right? Like there's there's some of this like we've
27:09 learned from just now deploying so many agents, but some of it was because they
27:13 put an FDE on our success team, right? Like Salesforce put an FDE on our
27:18 success team. Artisan is unique in that, you know, anytime I have an issue or or
27:23 I have an idea, I just go to, you know, the CEO or the head of product. You know,
27:28 Qualified? There's an FDE on our success team. Or, you know, Replit, we have an
27:33 FDE on our success team. There's just so many cases here where if you ask them
27:37 for that, they should give you some level of that service, right? Like to
27:42 make it work cuz they should want your business and they should want you to be
27:44 successful. Now, it doesn't mean that you need to meet with an FDE every week.
27:48 Like now I meet with them a lot less often than when getting started, right? But
27:51 you should ask them, at the very start at least, to have some FDE time.
27:59 >> Say one thing on tools. I know these are versions of things we've said since the
28:03 beginning. When you're talking to a vendor, >> if it doesn't feel right, don't buy it.
28:07 >> Yeah, >> it should feel right.
28:11 A lot of folks flame me a little bit when I say a lot of agents should almost
28:17 get you going for free, right? And a lot of the agents can't do that. There's
28:20 economic reasons. There's headcount limits. People can't really train you
28:23 and deploy you for free. >> But if you look at like um the 20VC that
28:29 I did with Harry and Rory when Marc Benioff came on, it was interesting when he
28:32 said he wished he could. He said he can't at Salesforce, but he wished he
28:36 had enough FDEs that everyone could be in production on Agentforce before they
28:41 had to pay. It's not practical, but the best ones take you as far down that
28:45 journey in the age of AI as they can. They're proud of their products.
28:48 They'll show it to you. If something doesn't smell right, if it doesn't feel
28:51 right, if you don't think it's going to work, it won't work. Buy another one.
28:56 Even if the brand's less good, even if it's scrappier, even if whatever, if it
29:01 doesn't smell right, if your Spidey sense says this agent isn't going to
29:03 work, don't buy it. [snorts] >> Yep, I agree. I think, too,
29:09 that's a good point you made on the free trial: that a lot of agents
29:13 cannot set you up for free. That's a really good point in the evaluation. So,
29:18 yeah, you know, we threw down for these agents. >> It makes it hard. It makes it hard to
29:22 take some risk. It is interesting that when you look at the
29:28 prosumer AI tools that we highlight, all the Reeves and the Gammas,
29:32 they're lucky because you can get so much value for free. Even forget about
29:36 29 bucks a month or 99 bucks, the free products are actually great. Like try
29:39 those tools. The problem with AI GTM tools is, even if they want to do it, they can't,
29:45 right? So, you've got to take some risk, but um maybe not later in the year,
29:52 but you know, don't do it if it doesn't feel right. [snorts]
29:55 >> All right, that's our kind of like build versus buy rule. And then once you get
29:59 to this point of the process, like um something I wanted to address, which is
30:03 also the title of our talk today: what does that look like in reality
30:07 once you get into multiple agents, right? And I'm going to say something
30:12 today: it's not so simple. Don't let that scare you. Don't let that
30:15 frighten you off of doing more than one agent. Maybe you stick to one and it
30:18 works really well, and that's fine. Not everybody needs to be
30:22 on a [clears throat] multi-agent management journey. But just know that if
30:26 you're on that journey today, for ourselves and from what I've heard
30:32 from some others, it's kind of all band-aided together. [laughter]
30:36 There's kind of a big reason folks like, you know,
30:39 Salesforce are having a big renaissance: because a lot of these
30:43 third-party tools we use, for instance, push back to Salesforce,
30:46 or we push all the data back to Salesforce with, like, a Zap or, you know,
30:50 whatever, or some of them have a native integration where they can push records and update
30:55 records back to Salesforce. And so a lot of the time right now it looks like, you
30:59 know, all of our third-party tools, whether we're APIing into them or not
31:03 or using things like Zapier, then we have all of our internal data, our VIPs,
31:07 right? We're pushing all that back into things like, you know,
31:12 Zapier, and back into things like Salesforce as our system of record,
31:17 just to keep all the records up to date somewhere central. But, you know, that's
31:23 not native right now. That's not native at this moment. And so, it
31:28 takes a lot of webhooks. If you haven't heard this word, you'll probably learn
31:33 it fast. We have so many webhooks in our Zapier account, I can't even
31:36 count them, right? Like we have so many webhooks just firing all the time to
31:40 push things back, but pushing them, again, into one kind of central thing.
31:45 And for now, that's like Salesforce cuz it's it can ingest all this data and
31:49 take all the context for our agents. Um, not to say you couldn't say,
31:53 okay, maybe I don't need that data everywhere all at once. Um, but I like
31:58 to have it. I like to, you know, I like to build the context of the agents from
32:03 one agent to another. Um, and to sort of let it build on itself, we use a lot of
32:07 webhooks. You know, we use Zapier. I know n8n is having a renaissance now
32:11 cuz it's kind of the same thing, but just built in the age of AI. Um, but
32:16 whatever one you use, right, you're going to you're going to see quickly.
32:19 And I've got a screenshot of it. You end up with a lot of like different hooks
32:23 and kind of hodgepodging things together. Um, but I think it's just for
32:27 now, right? I don't think that's a problem for always. I think it's just a
32:32 problem for now, you know, in the first half of 2026, to have it kind of web-
32:39 hooked into things. You need to make sure you can control the flow of what
32:43 your agents are doing and where that data is ultimately pushing back to and
32:47 pulling from. I do think you should pick one source of truth, right, at the end
32:52 of the day to store some of this and then build further context for your
32:55 agents. You know, we picked Salesforce; you could pick HubSpot or something else.
32:59 I think also to get used to your agents talking to each other on their own, you
33:05 know, it happens. Our agents talk to one another. It's fine. Get used to also as
33:11 a human like talking to your agents. Um, it is kind of a weird thing to at first
33:14 get used to and then you'll get used to it. And then also get used to, you know,
33:19 for now, copy-pasting context. Like we do a lot of context sharing between
33:21 our agents. like, yeah, some of this pushes to Salesforce. But sometimes I'm
33:25 like, you know what? I don't want it to push through that flow. I'm just going
33:27 to copy paste something in this context from one agent and then put in the other
33:30 agent like the way that it it understands context. And so, again,
33:35 that's not necessarily the simplest or the cleanest path um of multi-agent
33:40 management. And so, I just wanted to be for real about that, that in today's
33:44 world, that's what our reality looks like. But that's also because,
33:49 you know, we use a lot of specialized tools. Like there are obviously I know
33:52 there's like all-in-one agent builders out there. Some of them are
33:57 coming along. But for us, like, you know, I like to use the
34:01 specialized tools. I just still find that the output is a little bit better.
34:05 Like I like to I like to use the best of everything in each agent versus like an
34:09 all-in-one tool um that can build multiple agents. It just works
34:14 better for us. You might see success in using an all-in-one tool
34:18 that could build different agents across the board. But for us, since we
34:22 use very specialized third-party agents, this is the reality we live
34:26 in. But you might not live in it: if you pick one system that can do multiple
34:30 agents, you might just have to manage one from there. If you're like, okay,
34:33 you know, I'll trade off maybe some of the quality for quality of life in
34:39 managing all the agents, then that's fine. [snorts] All right. So, what do I mean
34:47 by this? In reality, right,
34:54 this is a screenshot of one of my Zaps. I'll explain to you what's
34:58 happening here, because this is a good one. I also wanted to show people a go-to-
35:02 market flow they could copy, um, maybe not necessarily at the same degree or
35:05 scale but this is one you could feasibly copy slash iterate on for yourselves
35:09 right once you get to multiple agents so you'll see it's catching a web hook
35:15 I think this webhook is SaaStr Annual, if I remember which one I screenshotted. Um, I think this one is
35:22 Annual. It's catching a webhook because, you know, there's a lot of
35:26 forms on our website. Um, and we vibe-coded the website. And so it's got a
35:31 webhook when you fill out the form. Um, and so anyways, it's catching this.
35:35 Basically, a webhook is a listening tool. If you don't know what a webhook is,
35:39 it's listening to say, okay, um, in this case, when you submit a form, the
35:44 webhook is going to catch it anytime there's a submission, and then I tell it what
35:47 to do with that hook, right? So, it's basically capturing that data. So, it's
35:52 catching the hook. It's porting that submission one to a Google sheet cuz I'm
35:56 crazy and I just like backups of everything also in Google Sheets. Like
35:59 again, you'll see, like you literally see in this flow, it's going to
36:02 Salesforce, but also, yeah, sometimes I just need a quick Google Sheet.
36:07 Sometimes it's just nice. So, it's pushing to Google Sheets. You'll see
36:11 it's pushing to Salesforce. So, um you could do this on contact or lead. Um,
36:15 you know, it also depends on your setup. We're in a flow where we have
36:20 Agentforce, and so, um, ours is triggered off contacts. You can trigger
36:23 yours off leads. Ours is triggered off contacts and so it's creating, you know,
36:28 a contact in Salesforce. It's adding a contact to a campaign. Now, in number
36:32 four, I circled it because I said, you know, we can pick when it adds a contact
36:36 to a campaign, if we want to send it to Agentforce already in this Zap, right?
36:39 cuz I have certain campaign triggers that say, "Okay, when they're added to
36:43 this campaign, trigger the agent to turn on." So again, you don't necessarily
36:46 need to do that if you're not ready for that yet, but it's something you could
36:50 do here feasibly, easily um and do it a little bit more automated, right?
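As an aside, the "webhook as a listening tool" idea the flow starts with can be sketched with nothing but Python's standard library. The form fields, the `/webhook` path, and the in-memory backup list below are hypothetical stand-ins for the real website form and the Google Sheets backup step:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the "back everything up to a Google Sheet" habit:
# here it's just an in-memory list of rows.
BACKUP_ROWS = []

def handle_submission(payload: dict) -> dict:
    """Do what the Zap does with a caught hook: normalize the form
    submission and record a backup row. Field names are hypothetical."""
    row = {
        "email": payload.get("email", "").strip().lower(),
        "company": payload.get("company", "").strip(),
        "form": payload.get("form", "unknown"),
    }
    BACKUP_ROWS.append(row)
    return row

class WebhookHandler(BaseHTTPRequestHandler):
    """Listens for form submissions, like the catch-hook step in the Zap."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        handle_submission(payload)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

# To actually listen, you would run (this blocks forever):
# HTTPServer(("localhost", 8080), WebhookHandler).serve_forever()
```

In Zapier the "Catch Hook" trigger plays the role of this listener; the point is only that a webhook is a URL that sits and waits for submissions, then hands each one to the rest of the flow.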
36:55 Then, you know, it's going to um find those
36:59 records of, you know, the company. Um, this is a little
37:03 misleading because it sounds simple, but it's finding the company records, right?
37:07 So since this is a contact-level record that it's created, um, and triggering to
37:11 Agentforce potentially, now it's going to find records. Okay, basically I'm
37:16 asking Salesforce to see, what is this company on the account level? Because we
37:21 use account-level contact uh records. What has this company done with us? And
37:24 so I want it to find those records of what that company has done with us, and
37:27 then, you know, I want it to get the record attachments. If you use Clay, you
37:33 can use it here in a very kind of fun way to say, okay, if I already have a
37:38 table in Clay, you can have it summarize things for you and then also
37:42 look at LinkedIn and say, okay, what else is this person actually doing
37:46 on LinkedIn? What are they doing? What are they posting on social media, for
37:49 example? So again, you can get more context. You could skip this step if
37:56 you're not into using a Clay table, but that's a fun way you could do it
37:59 there. And then you can send a Slack channel message to
38:03 send you all this context: okay, here's the, you know, here's the contact that I just
38:09 added to the campaign. Here's the account information about it. Here's
38:14 the, you know, Clay context about it. And then it'll send you a Slack about it.
38:18 And then if you really want to, you could do things like make a gamma. Like
38:22 if you wanted to make either a landing page or a presentation for this person
38:26 to send in their email about, you know, let's say, how to use Gamma at SaaStr, or
38:31 whatever, like how to use whatever your company is for SaaStr. You could do a
38:36 super complex flow like that. Have it make you a draft presentation or landing
38:41 page to send to you. And then, you know, in Gmail, you could create a draft
38:44 ultimately send to this person if you want to do it that way. Again, this is
38:47 just a sample go-to-market flow. You can see I didn't fully set up my Clay table because
38:53 I'm just speeding through this. But again, this is a good sample go to
38:57 market flow. You'll see it's got agents kind of layered in
39:01 it. There's an Agentforce layer in it. There's, you know, if you consider
39:04 Clay an agent, a Clay agent in there. You know, this one pushes to
39:09 Gmail, but if you have an AI SDR email platform, you might want it to
39:12 push to that platform. But, you know, all of that I think is just
39:18 important to see as an example in this multi-agent management sample
39:25 flow. Right? Again, this is just a sample flow of how you can feasibly kind
39:31 of manage agents, which right now for us is somewhat messy, but it looks a lot
39:35 like these Zapier flows. It's a lot of Zapier to Salesforce to other things to
39:42 APIs to whatever. And so yours may or may not look like this. I think a lot of
39:45 times folks will be like, "Oh, you guys have 20 agents. Like, who are you using
39:49 as your MCP?" I'm like, "We don't have one." Like, we don't have a true one. I
39:53 don't consider this Zapier and Salesforce thing a real MCP. I consider it MCP-lite,
40:02 but if you truly look up what MCP is, it's not a
40:07 true MCP. Like, yes, the context is sharing back and forth, and you can kind
40:11 of get there on Zapier and Salesforce, but again, I call it "light MCP" in air
40:18 quotes cuz it's not really an MCP. And so many people have been asking me that
40:21 lately cuz they've seen, you know, all of our content or our agents, and are like,
40:24 yeah, you know, what do you recommend I use for my MCP? I'm like, I'm not using one,
40:31 truly. Like, this is my MCP: it's a lot of human work. So [laughter]
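For reference, the Zap just walked through (catch the hook, back it up to a sheet, upsert the contact in Salesforce, add it to a campaign that may trigger Agentforce, look up account history, then notify Slack) can be sketched as a plain pipeline of steps over a shared context. Every function body below is a hypothetical stand-in for the real Zapier, Salesforce, Clay, and Slack actions; only the shape of the orchestration is the point:

```python
# Each step takes and returns a context dict, mirroring how data
# accumulates as a Zap runs. All values here are illustrative.

def catch_hook(ctx):
    # Step 1: the webhook catches the form submission.
    ctx["contact"] = {"email": ctx["raw"]["email"], "company": ctx["raw"]["company"]}
    return ctx

def backup_to_sheet(ctx):
    # Step 2: keep a backup row (stand-in for the Google Sheet).
    ctx.setdefault("sheet_rows", []).append(dict(ctx["contact"]))
    return ctx

def upsert_contact(ctx):
    # Step 3: create/update the contact (fake Salesforce-style id).
    ctx["contact_id"] = "003-" + ctx["contact"]["email"]
    return ctx

def add_to_campaign(ctx):
    # Step 4: certain campaigns hand the contact to the agent right away.
    ctx["campaign"] = "annual-2026"
    ctx["agent_triggered"] = ctx["campaign"] in {"annual-2026"}
    return ctx

def find_account_records(ctx):
    # Step 5: account-level history lookup (placeholder data).
    ctx["account_history"] = ["attended-2025", "sponsor-lead"]
    return ctx

def enrich_and_notify(ctx):
    # Step 6: summarize everything into a Slack-style message.
    ctx["slack_message"] = (f"New contact {ctx['contact']['email']} "
                            f"({ctx['contact']['company']}) added to {ctx['campaign']}; "
                            f"history: {', '.join(ctx['account_history'])}")
    return ctx

STEPS = [catch_hook, backup_to_sheet, upsert_contact,
         add_to_campaign, find_account_records, enrich_and_notify]

def run_flow(raw_submission):
    ctx = {"raw": raw_submission}
    for step in STEPS:
        ctx = step(ctx)
    return ctx
```

The optional Gamma and Gmail-draft steps from the talk would simply be two more functions appended to `STEPS`; that is the sense in which the whole thing is "Zapier to Salesforce to other things."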
40:37 again this may not be your use case, but this is how we've done it. Okay. Uh I
40:42 just want to deep dive into two quick things because I feel like there are um
40:47 a few related questions to it. So I have a few deep-dive slides on the AI SDR and
40:52 then a few deep dives on our AI VPM that I'll just quickly touch on, and then if
40:55 you guys like this content, I can go further. I don't know, I could do more
41:00 questions at another time, um, on another Wednesday that's not AI day. But yeah,
41:05 on a quick deep dive, I think um things to keep in mind if you're because a lot
41:09 of you in the chat seem to be rolling out like your first AI SDR. Now, I think
41:15 a few tips and tricks just agnostic of any tool that you use. I feel like this
41:20 is good um hopefully good advice across the board regardless of what tool you're
41:24 using, which is one to treat each outbound segment dynamically. And what I
41:28 mean by that is like even across our you know multiple agents we have for AI go
41:33 to market, I don't do what I see people do: one campaign for like 10,000 leads.
41:40 I'm like, no, I max my campaigns at like 100 to 500. Like I want each campaign, each
41:47 sub-agent, to be highly customized, highly trained to the exact segment that it's
41:52 going after, not like a broad "hey, have you heard about SaaStr?" Like, no, I want
41:56 to say okay these are my outbound segments I put a chart here on the right
42:01 that I made um for our outbound AI SDR funnel. Hopefully it's helpful, but I
42:05 treat each of these dynamically, and I train each sub-agent dynamically on each
42:10 of these things so that the output to Jason's earlier point is pretty good
42:14 right okay maybe it's not great but at least it's pretty good because
42:18 everything is tailored the audience is hyper segmented the messaging is hyper
42:22 segmented. The training is hyper segmented. Hyper segmentation in the age
42:27 of AI with these agents is your friend. Don't go don't spray and pray, please.
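The 100-to-500 cap and the signal-based (rather than geo/title) segmentation just described can be sketched like this. The signal names, lead fields, and the chunking helper are illustrative assumptions, not any particular tool's API:

```python
# Hyper-segmentation sketch: group leads by behavioral signal, then
# cap every campaign (sub-agent) at MAX_CAMPAIGN_SIZE leads so each
# one can be trained on exactly one narrow segment.

MAX_CAMPAIGN_SIZE = 500  # the "100 to 500" ceiling from the talk

def segment_leads(leads):
    """Return a list of campaign dicts, one per (signal, chunk) pair.
    Each campaign carries segment-specific context for its sub-agent."""
    by_signal = {}
    for lead in leads:
        # Segment on behavior (e.g. "abandoned_cart", "inbound"),
        # never on geo/title/role.
        by_signal.setdefault(lead["signal"], []).append(lead)

    campaigns = []
    for signal, group in sorted(by_signal.items()):
        for i in range(0, len(group), MAX_CAMPAIGN_SIZE):
            campaigns.append({
                "signal": signal,
                "leads": group[i:i + MAX_CAMPAIGN_SIZE],
                # Context the sub-agent is trained on for this segment only.
                "context": f"Leads with signal '{signal}'; tailor all messaging to it.",
            })
    return campaigns
```

A 10,000-lead blast would become dozens of small campaigns here, each with its own context string, which is the whole "don't spray and pray" point.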
42:31 Like don't do that with your agents. I see a lot of people do that. It's that's
42:36 how you get the bad emails off the AI SDRs. You know, another way to
42:41 think about it too is is to not think about it in the human ways of
42:45 segmentation, right? Like a lot of times um classic outbound would be, okay, I'm
42:48 going to do it on the geo of where they're based. I'm going to do it on
42:50 their title. I'm gonna do it on their role. You can see on my chart, none of
42:55 that exists here. Like I'm not doing any of that super high-level, almost
43:00 artificial segmenting. We do it hyper segmented. And the reason we do this is
43:05 to as I bolded it here, give your agent context. Right? If you're already used
43:10 to using ChatGPT and Claude, what you're doing with those agents every day is
43:14 talking to it, giving it context, telling it about your business. That's
43:18 the same thing you have to do for these AI go-to market SDR agents. You have to
43:23 give your agent context and the more context you give it, the better the
43:29 result will be. And so that's why I hyper segment everything list,
43:34 messaging, targeting, etc. All hyper-segmented for the AI SDRs. And that's
43:40 across all of our AI SDR agents, right? You have to give your agent context for
43:45 it to understand who are you trying to reach out to. What are their specific
43:49 pain points that your product and your tool can solve? And then I'm going to
43:53 use, you know, my classic AI move of, I can scrape the internet, see what their
43:57 company is doing, and relate it back to them. And so you'll see in my outbound
44:03 AI SDR funnel, none of this is like cold leads, and none of it is like geo or
44:08 title or location. And I think, too, this is a good list. I don't know how long this list is, 12 things,
44:15 but for most of you, start here. Like, start here with your AI SDRs. Too many
44:21 folks I see now are doing AI SDRs like, I'm just going to let it loose on cold
44:24 outbound because that's what our human SDRs don't want to do. I understand your
44:27 human SDRs don't want to do cold outbound to people who don't know you,
44:31 but neither does your AI agent because your AI agent does not have context for
44:35 why you should be reaching out to this person. So, same rules apply here in
44:42 outbound AI SDRs, you know, start with start with the hot people, the people on
44:45 your website. Um, a lot of these AI agent tools can de-anonymize some of your
44:49 website traffic to email them, people who have inbounded to you. If you have
44:53 like abandoned carts or trials or you have event leads, start with all the hot
44:57 people. Do the people who like, you know, was a customer, maybe they changed
45:01 jobs. Do current customers; we do this all the time. Like, I email
45:04 people who bought a ticket to come to SaaStr Annual in May,
45:08 and I email sponsors that are current customers to be like, hey, we
45:14 added a bunch of new stuff. I think too many folks kind of skip using AI for
45:19 expansion but it's a great way to do it. You know if you have recent marketing
45:22 leads because you're doing something like a webinar like this or you've
45:25 gotten ebooks or gated content or you spent some money on some sponsored media
45:29 and you got some leads, put those people onto the agent. Leads we never followed
45:33 up with, that we famously gave to Agentforce. Again, the list goes on. You
45:36 could see what I mean hopefully here. Like, there's so many hyper segments you
45:41 can give your agent before you give it a quote unquote like cold lead that knows
45:44 nothing about you that you should start here. And a lot of the reasons why you
45:49 should start here is not only will it give your agent context, it will give
45:52 your human team context on what works and what doesn't. So that by the time
45:56 maybe you exhaust this list, I still haven't exhausted this list after 8
45:59 months, but maybe you start to dwindle down this list because you don't have as
46:04 many contacts. Then you can start
46:09 to do the truly cold AI outbound to folks who maybe don't know you. But at
46:12 that point, you're using what worked. Again, this all goes back to what works,
46:16 right? Like at that point, you know how to train your AI agent. You kind of
46:20 know what's worked for these audiences. Then you can make a very
46:23 informed guess on what would work for a colder audience. All right, hopefully that's helpful. I
46:30 think the other quick thing just across the board and then I'll go into our um
46:36 AI VPM that we built, and try to do some quick questions, is, um, you know, AI is
46:41 great because it can ingest everything, right? We have it
46:46 ingest the best of everything: your best case studies, your best everything, right?
46:53 But also tell it what you can't do. And I think this is a super important nuance
46:56 that I've only learned after 8 months. I used to just be like, okay, here's
47:01 the best of everything: super good, stay in these boundaries.
47:07 And then over time, because the agents are so self-gratifying, trying to beat
47:09 themselves, right? It's like, "Hey, Amelia, I did pretty good." And
47:13 so now I'm going to start to maybe either make stuff up or try and beat
47:19 myself on my opens, clicks, meetings, and I'm going to start to say things
47:24 that maybe you didn't put into the context of the agent. And so I quickly
47:27 learned a couple months in actually that once you start to do this at scale, it's
47:35 maybe just as important to tell your AI agents what you can't do as what you
47:41 can. Like I have now told it, you know, okay, we don't do that, or we don't
47:45 do this, or we don't do that. Uh, you know, we don't offer people a
47:49 speaking slot. Like, yeah, we have speakers at SaaStr, but a lot of people
47:54 apply to speak. Send them to, you know, the content committee submission,
47:59 like, do that instead. So I think that's just an important nuance
48:02 I've learned over time. So hopefully that's helpful for you guys to know now,
48:07 earlier in your journey, that I kind of learned the hard way, cuz it
48:11 sent some emails it shouldn't have, of things that we didn't do, and I
48:17 realized it was because I didn't tell it that we couldn't do those things, right?
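One way to encode this lesson is to put hard "never" rules into the agent's system prompt and add a naive guardrail over outgoing drafts. The prompt format, the banned-phrase check, and all the wording beyond the speaking-slot example from the talk are illustrative assumptions, not any vendor's actual feature:

```python
# Sketch of "tell the agent what it CAN'T do", not just what it can.
# The speaking-slot rule comes from the talk; everything else is made up.

CANT_DO = [
    "offer anyone a speaking slot (route them to the content committee instead)",
    "promise features or programs that are not in the provided context",
]

def build_system_prompt(can_do: list[str]) -> str:
    """Assemble a system prompt with explicit CAN and NEVER sections."""
    lines = ["You are an outbound SDR agent.", "You CAN:"]
    lines += [f"- {c}" for c in can_do]
    lines.append("You must NEVER:")
    lines += [f"- {c}" for c in CANT_DO]
    lines.append("If asked for something on the NEVER list, decline and redirect.")
    return "\n".join(lines)

def violates_constraints(draft: str) -> bool:
    """Naive safety net: flag drafts that mention forbidden offers,
    catching the 'ambitious SDR' failure mode before an email goes out."""
    banned_phrases = ["speaking slot", "on the roadmap"]
    return any(p in draft.lower() for p in banned_phrases)
```

A real deployment would make the draft check semantic rather than keyword-based, but even this crude version would have caught an email promising something the company doesn't do.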
48:20 Like it was just ambitious, in the way that maybe a human SDR would be like,
48:24 oh, I don't know, I think we could do that, or, oh, I think that's on the roadmap,
48:29 classic, right? And so the AI agent did a little bit of that. And so I think
48:33 it's important now to say, okay, here's what we can do, and here's what we can't.
48:40 [snorts] Okay. Uh, this is a little context, but I want to go through it. I'm going to upload
48:47 these slides for everyone, so just ask. So don't sweat it. I'll also
48:51 send them to you. We still reply to everything. Um, maybe just the last uh
48:57 tidbit on AI SDR agents is, you know, if you have bad foundations, and what I
49:02 mean by that is bad context, that's where you'll see bad emails, right? Bad context
49:07 equals bad emails. Honestly, this bad email I put on here, I actually think a
49:10 human wrote, to be honest. It was written in a way that I actually don't think an
49:15 AI wrote it, but it was written in a way where they're just bad. These are
49:22 truly bad. But I have seen AI SDR emails that are of this quality.
49:27 I do think these are two human emails, cuz this person didn't actually know
49:30 where I worked. And I was like, AI would have gotten right where I work. So I
49:36 think a human wrote that, cuz that seems like a very basic mistake that an AI
49:41 would not make. So um I think that one's a little funny. But
49:50 anyways, uh yeah. So, this is one where, again... oh, I guess I didn't put
49:54 the screenshot of it. Sorry, I meant to add another screenshot
49:57 where they got the company that I worked for wrong and I was like that's not an
50:01 AI, that's a human. Uh, but I'll add it and then I'll show the slides. But, you
50:04 know, this other one of like, okay, again, this person wrote this. I'm
50:07 pretty sure they wrote this as a human. Maybe I'm wrong, but they just did it,
50:11 you know, based on, again, things I never segment on in an AI SDR. They did it
50:16 based on geo, like where the office is for SaaStr. They did it based on my
50:22 role at SaaStr. But clearly, again, I think a human wrote this and did not look
50:25 anything up, because they wanted me to use their tool, where I
50:29 literally mentioned that tool on this call. So I was like, well, at least
50:34 reference that, or at least acknowledge that I'm already
50:38 using a different AI SDR. That would have, you know, told me that at least
50:43 you listened to something, or your LLM listened to something I did, and knew that
50:47 I used this product. But just saying, like, are you thinking about using,
50:50 you know, what are your priorities for 2026? I'm like,
50:55 dude. Anyways, this is a bad email. Bad foundations, bad context
51:01 equals bad emails. Um, but also, you know, there's still plenty. Oh, here's
51:04 the other one. Yeah, this is the other one that somebody sent me. um that we
51:10 yeah, again, I think a human wrote this, not an AI, because they got the
51:13 company wrong, and I think an AI would get the company right. But I'm clearly on
51:16 another webinar talking about SaaStr, and I don't work at Forrester, and never
51:21 have I ever worked at Forrester. So I think a human wrote this and
51:27 just copy-pasted, um, and again, not an AI, because, you know, I think 100% of the
51:33 time, or maybe 99.9% of the time, our AI agents know where you work. Like,
51:36 maybe if I had worked there previously, I would give it a pass,
51:40 but I never ever worked there, so I don't give that one a pass. All right, in the
51:45 last few minutes our newest AI agent that we built and why we built it and
51:48 then you know we could do a follow-up to this cuz 5 minutes will not do it
51:54 justice but we did not find a viable third-party marketing agent that could
51:59 do more than content. Right? A lot of the marketing tools out there for true
52:03 go-to-market do a lot of content-related activities. The real problem we had was
52:08 orchestration, you know, based on data, based on already having other agents, based on
52:13 having, you know, proprietary agents, whatever. Like I had a need to build an
52:20 agent and I also knew we had a track record where like anytime I try to
52:24 onboard an actual human with all this data, they got overwhelmed, right? And
52:29 so I was like, okay, what can I do now, knowing what I know now, 8 months in, to
52:34 really capitalize on getting an agent to work that could, you know, push us, keep
52:39 us on track. And then ultimately what our AI VPM does is actually tell me what
52:44 to do. >> Just like most CMOs, >> I know >> they don't actually do the work. They
52:50 just tell everybody what to do. That's the dream job. >> That's the dream job. But the difference
52:56 on my agent is it at least it uses data to give me what to do. Right.
53:00 >> I see. That's what they did five years ago at their last one.
53:03 >> Not just like, hey, I used this playbook at my last company. I'm going to do it
53:07 here and bring in an agency and like a bunch of people. At least my agent was
53:12 like honest about, hey, here's the data. Here's where I think you're falling
53:14 short. Here's where you should double down. Here's where you should spend
53:17 more. Here's where you should hire a person. like it literally gave me all
53:22 this output which was quite nice. So yeah that's a little funny that's a good comparison.
53:29 So this is a quick slide on how we did it. I took a bunch of data from
53:34 our agents, from our third-party tools, from internal data that we've had over
53:37 the years. Not all of our data, just some, because it's a lot. So I
53:40 cherry-picked some of the best data, because I wanted it to action
53:44 on. You know, I looked at our Zapier workflows, I looked at, you know,
53:47 Salesforce; I took all of it. I pushed it into um Claude, just for purposes of this. Um, and then
53:57 I took what I did in Claude and I pushed it into Replit, just so I
54:00 can make it into a website that the rest of the team could access, because I was
54:05 like, okay, obviously my Claude is for me, and, maybe for good
54:09 reasons, you know, there's not good team sharing on
54:13 specific chats. And so I pushed it into Replit so I could make websites to
54:16 share with Jason and David and then some of our production team at SaaStr. So
54:20 we built our own. And we nicknamed it 10K for a lot of reasons, but at the end
54:25 of the day, how I built this custom agent was I already had something in
54:31 mind of what I wanted it to do. And so I had a very clear goal in mind: I
54:38 wanted to get to the first 10,000 attendees for SaaStr Annual in
54:42 May and the first 10 million of revenue for this year. So I
54:48 gave it very clear goals when I built this agent, and I gave it
54:53 context and data that related just to those two goals and those two things.
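As a rough illustration of that prep step, here's a sketch of cherry-picking only the best-performing records and formatting them, along with the two explicit goals, into one context document you could paste into Claude. The field names, numbers, and campaign names are all made up for illustration; this is not SaaStr's actual data or tooling.

```python
# Hypothetical sketch of the context-assembly step: keep only the
# strongest examples so the agent acts on signal, not noise.

GOALS = [
    "Reach the first 10,000 attendees for SaaStr Annual in May",
    "Reach the first $10M of revenue this year",
]

def build_context(records, top_n=3):
    """Cherry-pick the top records by conversion and format one context doc."""
    best = sorted(records, key=lambda r: r["conversion_rate"], reverse=True)[:top_n]
    lines = ["GOALS:"]
    lines += [f"- {g}" for g in GOALS]
    lines.append("BEST-PERFORMING EXAMPLES:")
    lines += [
        f"- [{r['source']}] {r['name']}: {r['conversion_rate']:.1%} conversion"
        for r in best
    ]
    return "\n".join(lines)

# Invented campaign records standing in for Salesforce/Zapier exports.
campaigns = [
    {"source": "Salesforce", "name": "Sponsor outbound v2",     "conversion_rate": 0.18},
    {"source": "Zapier",     "name": "Lapsed-attendee nurture", "conversion_rate": 0.09},
    {"source": "Salesforce", "name": "Generic cold blast",      "conversion_rate": 0.01},
    {"source": "Email",      "name": "Alumni re-invite",        "conversion_rate": 0.14},
]

print(build_context(campaigns))
```

The real push into Claude (and then into a Replit-built internal site) happens outside this sketch; the point is just to filter down to signal before handing data to the model.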
55:00 And basically, the architecture was it used a lot of Claude, um, Opus,
55:06 right, which is kind of the flagship model. I will say I had to upgrade to Max cuz
55:11 my Pro account ran out of memory, and it [snorts] did take me a weekend. This was
55:15 over a weekend I did this. Um, I had to upgrade to Max, which now I love, but
55:19 at one point, you know, I was on the Pro plan, and it was like, you're
55:23 out of memory, please wait until 3, please wait until 7. I was like, okay,
55:27 I'm just going to upgrade. I had to upgrade anyway cuz I was using the LLM a
55:31 lot. But I had it, you know, analyze all these things: all the emails, the data,
55:36 what's worked year over year, the registration patterns, the time of day,
55:39 and like, when do people buy a ticket to SaaStr Annual, when do people buy a sponsorship?
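That kind of when-do-people-buy analysis can be sketched in a few lines of standard-library Python; the timestamps below are invented for illustration, not real registration data.

```python
# Bucket past purchases by weekday and hour so the busiest buying
# windows stand out. Timestamps are toy data, not SaaStr's.
from collections import Counter
from datetime import datetime

def buying_patterns(timestamps):
    """Return (weekday counts, hour-of-day counts) for a list of ISO timestamps."""
    stamps = [datetime.fromisoformat(t) for t in timestamps]
    weekdays = Counter(s.strftime("%A") for s in stamps)
    hours = Counter(s.hour for s in stamps)
    return weekdays, hours

registrations = [
    "2025-01-07T10:15:00",  # Tuesday morning
    "2025-01-07T10:45:00",  # Tuesday morning
    "2025-01-08T14:05:00",  # Wednesday afternoon
    "2025-01-14T10:30:00",  # Tuesday morning
]

weekdays, hours = buying_patterns(registrations)
print(weekdays.most_common(1))  # → [('Tuesday', 3)]
print(hours.most_common(1))     # → [(10, 3)]
```

In practice you'd run this over the full registration history and feed the resulting pattern summary into the agent as context, rather than raw timestamps.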
55:45 Plus all of our recent agent interactions. I put,
55:49 I shouldn't say all, I put some of the recent agent interactions in there so we
55:53 could see how agents and people were interacting with SaaStr. I felt like that
55:57 was important context to give it as our AI VP of Marketing. Um, and then, you
56:03 know, I told it to give me an analysis of, you know, the next six
56:07 months. Give me a road map of everything we should be doing. And I said, "Give me
56:13 high level and then give me, uh, full details." And I'll show you that
56:17 in the next slide. I was like, I need every single marketing initiative at,
56:23 again, a high level and as an actual daily executable task. And I told it to give
56:26 me that, right? I think this is super important so that you don't just get a
56:30 bunch of like generic strategy ideas. Um, I told it I wanted executable tasks. I
56:34 wanted them grounded in the data, and I wanted it to be easy enough to follow so
56:39 that the humans here, the three plus one dog, could still execute it. And
56:45 so that's where, again, I think a lot of this just was a culmination
56:51 of our using agents, and I kind of knew what I wanted and I knew what context to
56:55 give it. I knew what data to give it. Um, I don't recommend just building your own
57:00 AI VP of Marketing today; um, you know, go to other agents first. But it's interesting because a
57:05 lot of stuff I was doing it said to blow up or abandon and then some stuff it
57:09 said to bring back and then there was a bunch of new stuff it told me to do. So
57:13 again since it was based in data I'm trusting it on what to do and how to run
57:18 these campaigns. And so you'll see here, again, at a
57:21 high level, it's giving me a week-over-week game plan for, uh, cumulative tickets. So
57:25 this is important: I told it this would be cumulative to everything
57:29 else we're already doing at SaaStr, but just give me additional ideas of what we
57:32 can do to, you know, maybe instead of having 10,000, we have 12,000 or
57:37 15,000 people. It's like, just give me cumulative ideas to get, you know, a couple
57:42 thousand more folks to SaaStr. Um, and so this was the game plan it came up
57:46 with. So you can see, you know, in the early weeks, it's like, okay, you can do
57:49 some early bird stuff in January, you can do some alumni stuff. And then when
57:52 you click each of these, it has literally what you should be doing, the
57:57 email, the it knows I'm using an AISR, it knows some of our AIS, like again, I
58:01 do have a context writing sent. This is what to do with qualified. This is what
58:04 to do with artisan. This is what to do with 184. This is what to tell Jason to
58:09 do on his social media. It literally gave me all these. It told me how much
58:13 to spend on LinkedIn, what the LinkedIn ad should be. Like, it is at that granular
58:17 level. Yeah, you can see on the left it's high level, but then it's also super
58:22 granular. And so I think this is where it's been an important back and
58:28 forth between us, of just seeing, you know, what works and what can be done with AI.
58:34 I think it's important too: 10K, as we nicknamed
58:40 him, can do a lot, but there are some things it can't do, right? Because I
58:49 built this as an internal
58:52 agent, today I don't have it hooked up to these tools directly. Right now you can imagine
58:54 Future of AI in Marketing
7:57 Challenges and Maintenance
8:01 nearly doubled. I think that's just in the nature of the agents, you know,
8:05 especially when it comes to folks that are inbounding and also how it's doing
8:08 outbound, just having a lot more context, right? It's a lot better
8:12 context. It's a lot better outreach in some cases, but then essentially for us
8:17 like across me and David specifically in sales, like it helps us with our
8:21 conversations because we already know what this person has said to the agent.
8:25 We can see the exact conversation they're having, and on
8:28 the website, um, we can literally see what else their company has been doing with our
8:32 agents, and we use that in the meeting, right? So it saves a lot of time. It's also a
8:35 lot more qualified when we get on that call, and I think a lot of that has to
8:39 do with some of the nurturing there. And I think more importantly, too, in those
8:44 stats: it did not cannibalize our other inbound revenue sources. Those have
8:49 just been augmented by our agents. And another thing I like to remind folks, too,
8:54 is like um it's not that we dropped a bunch of stuff either when we deploy
8:58 these agents, right? I think a lot of folks now will be like, well, okay, if I
9:03 if I just get an outbound AI SDR, then maybe I don't need any human SDR. And
9:07 maybe that's true, right? Between David and I, we do some
9:10 outbound ourselves, so we don't really have a human SDR, per se,
9:15 but I think there's a lot of pieces in which you might still need
9:20 both, cuz we do personally respond as humans to each message that
9:26 our AI agents produce, and so there's a lot of, you know, time needed there. I
9:31 don't necessarily like the AI autopilot responses; I still think it's better to
9:34 come from a person who actually knows the business. And I think, too,
9:40 um, you know, a lot of this has been augmented by our agents,
9:45 in helping us book more meetings, helping us understand the leads a little
9:49 bit. Again, we did not drop the other things we're doing, right? We still send
9:53 marketing emails. We still do outbound. We still send gifts to people. Like we
9:56 still invite people to come to the house. Like, all the things we used to do
9:59 before, with the agents, yeah, we could do them a little better, but we're
10:03 still doing them. And so I think that might be surprising to some folks, but
10:07 just know I don't think it will cannibalize anything if you do it right.
10:10 But I also think it can definitely augment versus, you know, you don't
10:15 necessarily have to replace. We've done that and it's
10:18 worked. But you may see that doing a mix works too. Okay. So cool. But you know, here's the
10:30 um here's maybe the honest truth that that you may not see on LinkedIn or X
10:38 and that is that we maintain these apps every day. like literally even this
10:43 morning before getting on AI day as of you know I'm like checking our agents
10:48 and I think the important thing here is like the agents and the humans have to
10:54 rapidly evolve and change constantly like it's such a mind share killer for
10:59 myself for Jason like we're in these agents you know I think 15 to 20 hours a
11:06 week each that's each not like between two of us that's each of us constantly
11:11 like iter operating with our agents constantly seeing, you know, what are
11:14 they outputting, checking the responses, making sure it doesn't hallucinate,
11:18 making sure, you know, it's talking to folks the way we want it to
11:22 talk to people, making sure it's adding value, making sure it's not degrading,
11:25 right? You sometimes you see these agents degrade over time. Um, and so I
11:30 think the important thing here is like it is kind of a real killer. It does
11:33 take a lot of time. I don't think you can replace the time of managing, you
11:37 know, we've just seen the time shift. Like the time we used to spend managing
11:41 slightly more people on our team. We now spend that same amount of time, if not
11:44 more, managing the agents, but it's just a lot different, right? Like,
11:49 there's no people drama, really. But the agents just work at such a
11:54 higher capacity and higher scale than a human being that it's hard to eventually
11:59 keep up with them. And I put this here in bold. You know, I've been trying for
12:03 the last few months to keep up with my agents, and then I
12:06 realized it was futile because I can't do it. Uh, but I try and keep up as
12:10 best as I can. And what I truly mean is, you know, anytime we get a
12:14 response, we have a system that'll Slack us from any of our agents. So
12:19 whenever there's an interaction, or an agent is having a conversation with somebody
12:23 and we want to reply, and, you know, again, we like to reply ourselves,
12:28 we do try and respond to those people literally instantaneously, if not
12:32 in real time. Sometimes we are asleep, and so we, you know, respond to them
12:36 first thing in the morning. But I've realized I can't keep up with my agents;
12:41 they're smarter than me. >> I'll tell you, um, just one nuanced
12:45 learning actually from this week. So, I was meeting last night with the CEO of
12:51 a next-generation AI go-to-market agent company, um,
12:55 that's already got millions of revenue, um, and is publicly launching in a few
12:58 weeks. But they already have millions of revenue, and, um, I've known them
13:02 for a long time, but I asked what the secret sauce was, and the secret sauce
13:08 was they do everything: they do the onboarding, they do the tagging, they get the first campaigns
13:15 running, they do everything. And they do it almost to such a fault that
13:18 [clears throat] some of the customers think it's too easy. They don't even
13:21 realize the energy that's going into it if they haven't deployed an agent yet.
13:28 And the learning from that is just, if you haven't deployed many agents, or
13:32 any for real, you got to have an honest conversation, not with
13:35 someone in sales that doesn't know how the product works or use it themselves,
13:39 but with a forward deployed engineer, with a leader, and find out what it is going to
13:44 take to be successful upfront: the first 14 and 30 days, and every day
13:48 thereafter. And then you got to do it, or it will fail. >> Yep.
13:52 >> And meet with the best of them. Um, and you know, if you don't know your
13:57 FTE, if you don't know the answers to these questions and you're spending any
14:01 material amount of money on an agent, you're wasting your energy. And so it
14:04 will be interesting to see where it goes this year. A lot of the agents we use, um,
14:09 are pushing down market to be more self-service. So far that doesn't work.
14:13 So far, I will say, for the most part, agents that require deep training cannot
14:18 be self-trained. It will come. Agents are getting so much better. That's at the
14:22 frontier. But wait and see. Be skeptical. If you buy a cheap tool that says it's
14:27 self-trained, make sure it works, and, you know, take the time. If you buy a more complicated
14:32 tool like we're talking about, just talk with someone senior enough on
14:36 deployment, not, again, someone trying to sell you something who doesn't know,
14:40 and be honest about what it's going to take. Um, otherwise it's like going to
14:43 the doctor and getting a prescription for medicine and never taking it. It's
14:46 not going to work. It's literally like that for an agent, right? Um, but this
14:50 was the first one I saw that could do the type of stuff Amelia and I are
14:54 talking about, but without you doing any work. But it's because they have a huge
14:57 human team. They call them forward deployed AEs in the beginning, and then
15:01 other folks take it over. But that's an extreme case. I haven't seen any other
15:04 app like this that can just do it without consistent training and work
15:09 every single day. Every single day. [snorts] >> Yeah, we, you know, we do spend a lot of
15:13 time per week actively managing all of our agents. Just I think to Jason's point, be
15:20 prepared, right? It is, again, I think I see too many folks who they see the
15:23 really good stats and they get, you know, even our stats are, you know,
15:26 fairly good and they get a little like mesmerized by them thinking, you know,
15:31 okay, it's AI, I can just prompt it and it'll it'll do it fairly quickly. I'll
15:36 say too, like most of these agents have some sort of prompting in them, but
15:41 they're not necessarily all built via a prompt, right? Like don't think about it
15:44 the same way you think about putting a prompt into OpenAI or Claude. It's not
15:49 going to function the same way. There's a lot of like there is some prompting in
15:53 each of these third party tools, but at the end of the day, you're going to have
15:56 to like figure out what works, put that into, you know, their prompt builder in
16:01 whatever format they have, um, and refine it from there. And then there's
16:03 usually additional steps other than a prompt. So I think some of the tools
16:08 we've seen are kind of getting to that point. Maybe they'll get there like by
16:11 the time we're at SaaStr Annual, or maybe right after, in like the second half of
16:14 the year. Um, but most of them are not just a prompt builder. So I think that's
16:18 another thing to just bear in mind: if you're used to kind of
16:23 this easy path that Claude and ChatGPT and OpenAI have, you know,
16:28 put out there, they're not all like that. They're not all that
16:31 easy. Now, I will say you can use things like ChatGPT and Claude to make your
16:35 prompts better. I do that all the time. Like, I'll take whatever context I'm
16:38 putting into our agents, I'll run it through Claude, I'll run it through
16:42 ChatGPT and see what it suggests to make it better. Sometimes I'll take or leave the
16:45 suggestion. Sometimes it's not, you know, the AI doesn't know your business
16:50 the same way you do sometimes. And so, I think that's just another thing to be
16:53 prepared for. So, how did we get to this? I think this will address some of
16:57 the questions as well. Yeah. How did we get to these results of, you know, $4.8
17:00 million in additional revenue, $2.4 million of it closed-won? So
17:03 about half that. And then also, you know, we've now crossed the 60K
17:07 mark in how many emails and interactions our AI agents have had just
17:12 in the sales funnel, right? That's not even counting the close to a
17:16 million we've had in our proprietary interactions with our vibe-coded apps.
17:20 But that's a lot right there. >> You're going to explain how
17:24 this looks, and given how tiny we are, it's pretty impressive numbers. But if I
17:28 had to summarize all of this, and then challenge me if I'm wrong, cuz you're
17:31 doing the real work, right, I think some of the key here is not that
17:36 all these emails were the best emails that have ever been sent in the history
17:39 of mankind. Um, I know you think they're great. I actually just think they're
17:42 okay. I don't think they're bad. I think they're better than most of the outbound
17:46 emails I'm going to get during this day, during AI Day. The ones we send are
17:49 better, but they're not the best I've ever seen; they're not something you can spend
17:53 an hour crafting. I think the number one key, and that's why the 60,000 number is
17:59 cool: we are touching folks, lapsed folks, folks we forgot to talk to, folks we don't talk
18:06 to enough, more often. We are connecting with more people more often. Not spam,
18:11 right? But we couldn't do 60,000 high-quality emails manually or even with
18:16 old school outreach tools. We just couldn't do it, right? So I think the key
18:22 to this, tell me if I'm wrong and then we'll be quiet, I think it's just
18:26 more high quality, pretty good interactions. >> Yep. >> That's the thing. We're getting scale.
18:32 And when I think about everything we've learned, and
18:34 you've done most of the work, Amelia, if I had to advise people that are earlier
18:38 on their journey, find something in your go to market motion that just isn't
18:42 getting done or is getting done at a very mediocre level.
18:45 >> Yep. >> Then put an agent there. Don't try to replace
18:50 what's working well. Do that as your 10th agent or your 20th. Like, literally,
18:55 we're such a tiny team. We just weren't reaching out to enough people in our
18:59 base, in our activated base. And so that's the low-hanging fruit for us. We
19:03 just could not do it. Sure, you could put something in a dated Outreach or Salesloft
19:08 cadence, but that doesn't work, right? But we never would have done this otherwise.
19:13 So find that low-hanging fruit, the stuff in your funnel that you're just not
19:16 getting to. The customers that are too small, the customers that take too long
19:20 to respond and your team doesn't want to do. The customers that have,
19:25 you know, lower scores, right? They
19:29 still have intent, but no one wants to call them back. Do those ones,
19:33 whatever your low-hanging fruit is, because then even if you get some yield,
19:38 it's magical. >> Yep, I agree. So I think, again, a little
19:44 bit of a misconception here, related to some of the chatter: the
19:50 formula for us is to copy your best human. Like, as you're deploying, maybe
19:53 you've already deployed one agent, maybe you're deploying your next agent, to
19:57 Jason's point, right, do something that you could get a lot of scale out
20:03 of by adding an agent and do pretty good. There's just some things the agents
20:08 can't slash shouldn't do. Like, obviously we love agents. I love our agents. I
20:11 use a lot of them. You know, I just added the AI VP of Marketing. We'll
20:15 show you guys. But there are some things I'm like, the agent would just suck at
20:18 that, so I'm just not going to do it. There are some things I
20:22 still need humans to do. Like a lot of the production stuff we're doing for SaaStr
20:25 Annual, I'm like, I still need a human to do that. Dude, the agent is not there
20:30 yet. But what we also mean by copy your human is, if you're going to add AI
20:35 agents, add them to help you scale, right? Scaling more emails,
20:40 more meetings, more clicks, more volume. Figure out what works first. I see too
20:45 many people who, you know, okay, they want to automatically give an AI SDR to
20:51 their SDRs. And I'm like, okay, well, are those SDRs new? Did they just
20:54 join? I don't think it's a good idea to give it to every single SDR. I think
20:58 there's a lot of reasons that would get you into trouble
21:03 fairly quickly in terms of workflows. But I also think, if you don't know
21:08 what works first, I don't get this mindset of like, oh, it didn't work, but
21:12 I'm just going to add AI and it'll magically work now. Like, no. If it
21:16 didn't work or wasn't working before AI, it's not going to magically work now.
21:19 And somebody asked me this question yesterday when I was doing a Salesforce
21:23 webinar like, okay, what if I'm a super early stage startup and I don't know
21:27 what works? And I was like, well, do you have any customers? They're like, yeah,
21:29 we have like, you know, 10 paying customers. I'm like, "Well, go ask your
21:33 customers why they bought you." Like, everybody has at least a few customers
21:36 or maybe if you're super early stage and you've got a few folks on a trial, just
21:41 go ask them. Like, figure out what worked. Figure out what got them into
21:44 your product and is getting them hands-on with the product. Figure out what works first,
21:49 right? We had so much data that we ran through before we put it into any of our
21:53 agents on what was working, right? The best email copy
21:58 for doing outbound, the best responses for how we should follow up with inbounds,
22:04 you know, the best, um, context and verbiage about SaaStr, about SaaStr's
22:09 events, about sponsoring SaaStr. Like, we went through all this data, we went through
22:12 all this context, we flagged everything that was the best of everything before
22:19 we put it into any agents, any AI, etc. So, you know, train the agent on
22:24 what works best. I think I see too many people now falling into the trap of they
22:28 want to add AI into something new. And sometimes you can, and it will work to
22:32 some degree, but I think if you do it and you train it on the best of
22:35 everything, it will work that much better, right? It will get you to pretty
22:38 good, to Jason's point. Like, it'll get you to pretty good emails. They still
22:42 may not be the best on planet Earth, but I think it'll
22:46 put you over that bar of pretty good versus crappy AI emails that we've all
22:50 seen, or even crappy human emails that we've all seen. So yeah, that's why I'm convinced
22:55 on it. But yeah, I think, you know, you have to train it on the best of
22:58 everything. And if you don't know what that is yet, I would take that time,
23:02 take a week, figure that out before, you know, you deploy your first or next
23:07 agent, and then see where that gets you. I feel like you'll have a
23:12 better output, because we are constantly iterating our agents now to make sure
23:16 they have the best of everything and that they know everything that we
23:22 know, as we know it, right? So, like, as we get, you know, speakers for SaaStr, or
23:26 we have new things that we're doing, or now we've got like lounges and stuff, or
23:29 like new things in our sales process, I'm constantly making sure the
23:33 agent knows all this so that it can talk about it. Okay, so I want to address some of the
23:40 questions in the chat, like, you know, some folks are asking about
23:44 evaluation tools, like what's our process, and then this is the 90/10 rule
23:50 that Jason came up with, but I really do agree with it and I think it's a good
23:56 one, which is, you know, buy 90% of your AI stack, and I'll talk about the evaluation
24:00 process we've done in a second, and only build the 10% where
24:05 I think there's, you know, no vendor that can do this well, and
24:10 it's either a P1 priority, or, as you'll see in our AI VP of Marketing, you know,
24:14 I built that agent cuz it wasn't a commodity. It was something where,
24:18 even with all of our agents now, I was like, I still have so much data from
24:24 SaaStr internally that I want to act on, and I want to deploy this agent in a way
24:27 that maybe I don't need it to run everything automatically. So, it's a
24:32 very specific use case, but that was where I kind of built that and that's
24:35 where that kind of came from, right? It's like, okay, I had all this
24:38 data. I wanted to do something that was more internal facing, not necessarily
24:41 external like a lot of our go to market agents are. Um, and so in that case, it
24:46 made sense to build. I'd say for a lot of things, it doesn't make sense to
24:49 build, right? Like, um, if you guys listen to the podcast Kyle and Jason
24:53 did, I think we put it up like last week or something. Kyle, who's the CRO of
24:57 Owner, talks about how, you know, he's also roughly followed the 90/10
25:01 rule: you know, he's bought a lot of third-party agents, um, he's made them
25:05 work, and then he hired somebody who was, I think, a former
25:09 founder or something, right, Jason, to build a proprietary in-house tool. And
25:13 that's like one extreme, right? But even for that 10% that he's building
25:17 in-house, he hired somebody who was a CEO, was an engineer, knew how
25:21 to code, like, I think he was the CEO of an LLM company or something,
25:24 knew all this crazy stuff and could build a proprietary internal
25:29 agent. But again, for a lot of things, it doesn't make sense. Um, I think too, just to address some of
25:38 the questions in the chat of like what's been our criteria. Um, and we've
25:42 talked about it a little bit before, but when you're evaluating these tools for
25:46 the 90% you want to buy. Um, I think the important thing is to
25:51 one, you know, again, I don't know why I think in the age of AI people sometimes
25:55 will throw things away because they're like, "Oh, there's this shiny
25:58 new object." I literally asked all of these AI tools that we now use and
26:04 deploy for help. I was like, one, I'm going to need help, like
26:08 I'm going to need an FDE, and two, let me talk to people who have used this. Like,
26:12 I think I see too often folks are like, okay, it's an AI tool and so I'm not
26:16 going to ask for a customer reference. Like ask for a customer reference. I do
26:20 these all the time now. Like I try to make them as short as possible now cuz
26:24 um you know we do these webinars and stuff too, but I do these all the time
26:28 now. Like, Marshall from Mangomint, Kyle from Owner, like we do this all the
26:31 time. Like Phipe from Persona, we do this all the time now. People ask us
26:34 constantly for like, you know, a customer reference. Like it's con like
26:38 ask them for a customer reference and if you can ask them for one like in your
26:42 vertical, see what they say to you, right? Like if they push back, maybe
26:46 don't use that vendor. Yeah, most of these folks have
26:49 at least one customer that's slightly like yours. If they don't have one in
26:51 your vertical, maybe you can give them a pass on that, but at least talk to a
26:57 customer and then see how much they will help you, right? I think a lot of these
27:00 tools to their credit for the third party tools we do use now have been
27:04 helping us along the way, right? Like there's there's some of this like we've
27:09 learned from just now deploying so many agents, but some of it was because they
27:13 put an FTE on our success team, right? Like, Salesforce put an FDE on our
27:18 success team. Artisan is unique in that, you know, anytime I have an issue or
27:23 I have an idea, I just ping, you know, the CEO or the head of product. You know,
27:28 Qualified, there's an FTE on our success team. Oh, you know, Replit, we have an
27:33 FDE on our success team. There's just so many cases here where if you ask them
27:37 for that, they should give you some level of that service, right? Like to
27:42 make it work cuz they should want your business and they should want you to be
27:44 successful. Now, it doesn't mean that you need to have an FTE like every week.
27:48 Like now I meet with them a lot less often than getting started, right? But
27:51 you should ask them at the very start, at the very least, to have some FTE time.
27:59 >> Say one thing on tools. I know this is versions of things we've said since the
28:03 beginning. When you're talking to a vendor, >> if it doesn't feel right, don't buy it.
28:07 >> Yeah, >> it should feel right.
28:11 A lot of folks flame me a little bit when I say a lot of agents should almost
28:17 get you going for free, right? And a lot of the agents can't do that. There's
28:20 economic reasons. There's headcount limits. People can't really train you
28:23 and deploy you for free. >> But if you look at like um the 20VC that
28:29 I did with Harry and Rory when Marc Benioff came on, it was interesting when he
28:32 said he wished he could. He said he can't at Salesforce, but he wished he
28:36 had enough FDEs that everyone could be in production on Agentforce before they
28:41 had to pay. It's not practical, but the best ones take you as far down that
28:45 journey in the age of AI as they can. They're proud of their products.
28:48 They'll show it to you. If something doesn't smell right, if it doesn't feel
28:51 right, if you don't think it's going to work, it won't work. Buy another one.
28:56 Even if the brand's less good, even if it's scrappier, even if whatever, if
29:01 your Spidey sense says this agent isn't going to
29:03 work, don't buy it. [snorts] >> Yep. I agree. I think too, like um yeah,
29:09 that's the point you made on the free trial, that a lot of agents
29:13 cannot set you up for free. That's a really good point in the evaluation. So,
29:18 yeah, we you know, we threw down for these agents. >> It makes it hard. It makes it hard to
29:22 take some risk. It is interesting that when you look at the
29:28 prosumer AI tools that we highlight, all the Reves and the Gammas,
29:32 they're lucky, because you can get so much value for free. Even forget about
29:36 29 bucks a month or 99 bucks, actually the free products are great. Like, try
29:39 those tools. The problem with AI GTM tools is, even if they want to do it, they can't do it,
29:45 right? So, you've got to take some risk, but, um, maybe less so later in the year.
29:52 But, you know, don't do it if it doesn't feel right. [snorts]
29:55 >> All right, that's our kind of like build versus buy rule. And then once you get
29:59 to this point of the process, like um something I wanted to address, which is
30:03 also the title of our talk today, was what does that look like in reality
30:07 once you get into multiple agents, right? And I'm going to say something
30:12 today that it's not so simple. Don't let that scare you. Don't let that like
30:15 frighten you off of doing more than one agent and maybe you stick to one and it
30:18 works really well, and that's fine. Not everybody needs to be, I think,
30:22 on a [clears throat] multi-agent management journey. But just know that if
30:26 you are in that journey today, for ourselves and what I've heard
30:32 from some others is, it's kind of all band-aided together. [laughter]
30:36 There's, you know, kind of a big reason folks like, you know,
30:39 Salesforce are having a big renaissance, because a lot of these
30:43 third-party tools we use, for instance, push back to Salesforce,
30:46 or we push all the data back to Salesforce with like a Zap or, you know,
30:50 whatever, or some of them have a native integration that can push records and update
30:55 records back to Salesforce. And so a lot of the time right now it looks like, you
30:59 know, all of our third-party tools, whether we're APIing into them or not,
31:03 or using things like Zapier, then we have all of our internal data, our vibe-coded
31:07 apps, right? We're pushing all that back through things like, you know,
31:12 Claude, Zapier, back into things like Salesforce as our system of record,
31:17 just to keep all the records up to date somewhere central. But, you know, that's
31:23 not native now, right? For now. That's not native at this moment. And so, it
31:28 takes a lot of webhooks. If you haven't heard this word, you'll probably learn
31:33 it fast. We have so many webhooks in our Zapier account, I can't even
31:36 count them, right? Like, we have so many webhooks just firing all the time to
31:40 push things back, but I'm pushing them again into one kind of thing.
31:45 And for now, that's Salesforce, because it can ingest all this data and
31:49 take all the context for our agents. Not to say you couldn't say,
31:53 okay, maybe I don't need that data everywhere all at once. But I like
31:58 to have it. I like to build the context of the agents from
32:03 one agent to another. And to sort of let it build on itself, we use a lot of
32:07 webhooks. We use Zapier. I know n8n is having a renaissance now,
32:11 because it's kind of the same thing but built in the age of AI. But
32:16 whichever one you use, you're going to see quickly,
32:19 and I've got a screenshot of it, that you end up with a lot of different hooks
32:23 and kind of hodgepodging things together. But I think it's just for
32:27 now, right? I don't think that's a problem forever. I think it's just a
32:32 problem for now, in the first half of 2026, to have it web-
32:39 hooked into things. You need to make sure you can control the flow of what
32:43 your agents are doing and where that data is ultimately pushing back to and
32:47 pulling from. I do think you should pick one source of truth at the end
32:52 of the day to store some of this and then build further context for your
32:55 agents. We put Salesforce; you could pick HubSpot or something else.
32:59 I think also, get used to your agents talking to each other on their own. You
33:05 know, it happens. Our agents talk to one another. It's fine. Also get used,
33:11 as a human, to talking to your agents. It is kind of a weird thing to
33:14 get used to at first, and then you'll get used to it. And then also get used,
33:19 for now, to copy-pasting context. We do a lot of context sharing between
33:21 our agents. Yeah, some of this pushes to Salesforce. But sometimes I'm
33:25 like, you know what? I don't want it to push through that flow. I'm just going
33:27 to copy-paste some context from one agent and put it in the other
33:30 agent, in the way that it understands context. And so, again,
33:35 that's not necessarily the simplest or the cleanest path of multi-agent
33:40 management. And so I just wanted to be for real about that: in today's
33:44 world, that's what our reality looks like. But that's also because
33:49 we use a lot of specialized tools. I know
33:52 there are all-in-one agent builders out there. Some of them are
33:57 coming along, some to our dismay. But for us, I like to use the
34:01 specialized tools. I just still find that the output is a little bit better.
34:05 I like to use the best of everything in each agent, versus an
34:09 all-in-one tool that can build multiple agents. For us it just works
34:14 better. You might see success in using an all-in-one tool
34:18 that could build different agents across the board. But for us, since we
34:22 use very specialized third-party agents, this is the reality we live
34:26 in. But you might not live in it: if you pick one system that can do multiple
34:30 agents, you might just have to manage one from there. If you're like, okay,
34:33 I'll trade off maybe some of the quality for quality of life in
34:39 managing all the agents, then that's fine too. All right. So, what do I mean
34:47 by this? In reality, this is a screenshot of one of my Zaps.
34:54 I'll explain what's
34:58 happening here, because I also wanted to show people a go-to-
35:02 market flow they could copy, maybe not necessarily at the same degree or
35:05 scale, but this is one you could feasibly copy slash iterate on for yourselves
35:09 once you get to multiple agents. So you'll see it's catching a webhook.
35:15 I think this webhook is SaaStr Annual, if I remember which one I screenshotted. I think this one is
35:22 Annual. It's catching a webhook because there are a lot of
35:26 forms on our website. And we vibe-coded the website, so it's got a
35:31 webhook that fires when you fill out the form. And so anyways, it's catching this.
35:35 Basically, a webhook is a listening tool. If you don't know what a webhook is,
35:39 it's listening to say: okay, in this case, when you submit a form, the
35:44 webhook is going to catch it anytime there's a submission, and then you tell it what
35:47 to do with that hook, right? So it's basically capturing that data. So it's
35:52 catching the hook. It's porting that submission, one, to a Google Sheet, because I'm
35:56 crazy and I just like backups of everything in Google Sheets too. Like,
35:59 again, you'll literally see in this flow it's going to
36:02 Salesforce, but sometimes I just need a quick Google Sheet.
36:07 Sometimes it's just nice. So, it's pushing to Google Sheets. You'll see
36:11 it's pushing to Salesforce. You could do this on contact or lead.
36:15 It also depends on your flow; we have Agent-
36:20 force, and ours is triggered off contacts. You can trigger
36:23 yours off leads; ours is triggered off contacts. And so it's creating
36:28 a contact in Salesforce. It's adding the contact to a campaign. Now, number
36:32 four I circled because we can pick, when it adds a contact
36:36 to a campaign, whether we want to send it to Agentforce already in this Zap, right?
36:39 Because I have certain campaign triggers that say, okay, when they're added to
36:43 this campaign, trigger the agent to turn on. So again, you don't necessarily
36:46 need to do that if you're not ready for it yet, but it's something you could
36:50 do here feasibly, easily, and a little more automated, right?
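Since webhooks come up so much here, the "catch the hook, then decide what to do with it" step can be sketched in a few lines. The payload fields and action names below are hypothetical, just mirroring the flow described; they are not the actual form schema or any real Zapier or Salesforce API.

```python
import json

# Minimal sketch of catching a form-submission webhook and routing it.
# Field names ("form", "email", "campaign") and action labels are made up
# for illustration; they mirror the Zap steps described in the talk.
def handle_form_webhook(raw_body: str) -> dict:
    submission = json.loads(raw_body)
    actions = ["append_to_google_sheet"]       # back up every submission
    actions.append("create_salesforce_contact")
    if submission.get("campaign"):
        # certain campaigns are wired to turn an agent on
        actions.append("add_to_campaign")
        actions.append("trigger_agentforce")
    return {"email": submission.get("email"), "actions": actions}

payload = '{"form": "annual", "email": "jane@example.com", "campaign": "early-bird"}'
print(handle_form_webhook(payload))
```

The point is just that a webhook handler is a listener plus a routing decision; everything after the catch is ordinary branching logic.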
36:55 Then it's going to find those
36:59 records of the company. This is a little
37:03 misleading, because it sounds simple, but it's finding the company records, right?
37:07 Since this is a contact-level contact that it's created, and it's
37:11 potentially triggering to Agentforce, now it's going to find records. Basically, I'm
37:16 asking Salesforce to see what this company is on the account level, because we
37:21 use account-level contact records: what has this company done with us? And
37:24 so I want it to find those records of what that company has done with us, and
37:27 then I want it to get the record attachments. If you use Clay, you
37:33 can use it here in a very kind of fun way to say, okay, if I already have a
37:38 table in Clay, you can have it summarized for you, and then also
37:42 look at LinkedIn and say: okay, what else is this person actually doing
37:46 on LinkedIn? What are they posting on social media, for
37:49 example? So again, you can get more context. You could skip this step if
37:56 you're not into using a Clay table, but that's a fun way you could do it
37:59 there. And then you can send a Slack channel message with all this
38:03 context: okay, here's the contact I just
38:09 added to the campaign. Here's the account information about it. Here's
38:14 the Clay context about it. And it'll send you a Slack message about it.
38:18 And then if you really want to, you could do things like make a Gamma.
38:22 If you wanted to make either a landing page or a presentation for this person
38:26 to send in their email about, say, how to use Gamma at SaaStr, or
38:31 whatever, how to use whatever your company is for SaaStr, you could do a
38:36 super complex flow like that: have it make you a draft presentation or landing
38:41 page and send it to you. And then in Gmail, you could create a draft to
38:44 ultimately send to this person, if you want to do it that way. Again, this is
38:47 just a sample go-to-market flow. You can see I didn't fully set up my Clay table, because
38:53 I'm just speeding through this. But again, this is a good sample go-to-
38:57 market flow. You'll see it's got agents layered in
39:01 it. There's an Agentforce layer in it. If you consider
39:04 Clay an agent, there's a Clay agent in there. This one pushes to
39:09 Gmail, but if you have an AI SDR email platform, you might want it to
39:12 push to that platform. But all that, I think, is just
39:18 important to see as an example in this multi-agent management sample
39:25 flow. Right? Again, this is just a sample flow of how you can feasibly
39:31 manage agents, which right now for us is somewhat messy, but it looks a lot
39:35 like these Zapier flows. It's a lot of Zapier to Salesforce to other things to
39:42 APIs to whatever. And so yours may or may not look like this. A lot of
39:45 times folks will be like, "Oh, you guys have 20 agents. Who are you using
39:49 as your MCP?" I'm like, "We don't have one." I
39:53 don't consider this Zapier-and-Salesforce thing a real MCP. I consider it MCP-light,
40:02 but if you truly look up what MCP is, this is not a
40:07 true MCP. Yes, the context is sharing back and forth, and you can kind
40:11 of get there on Zapier and Salesforce, but again, I call it "light MCP" in air
40:18 quotes because it's not really MCP. And so many people have been asking me that
40:21 lately, because they've seen all of our content on our agents. They're like: yeah,
40:24 what do you recommend I use for my MCP? I'm like, I'm not using one,
40:31 truly. This is my MCP. It's a lot of human work. [laughter]
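Spelled out, the Zap being described is roughly this ordered pipeline. The step names are illustrative labels for the screenshot's steps, not real Zapier or Salesforce actions.

```python
# The sample go-to-market Zap from the talk, as an ordered pipeline sketch.
# Step names are made-up labels mirroring the description, not real APIs.
GTM_FLOW = [
    "catch_webhook",              # 1. a website form submission fires the hook
    "append_to_google_sheet",     # 2. backup copy of every submission
    "create_salesforce_contact",  # 3. contact or lead, depending on your setup
    "add_contact_to_campaign",    # 4. certain campaigns trigger Agentforce
    "find_account_records",       # 5. what has this company done with us?
    "get_record_attachments",
    "summarize_with_clay",        # 6. optional: Clay table + LinkedIn context
    "post_slack_summary",         # 7. all the context in one Slack message
    "draft_gamma_asset",          # 8. optional: landing page or deck
    "create_gmail_draft",         # 9. or push to an AI SDR email platform
]

for step in GTM_FLOW:
    print(step)
```

Seeing it as a linear list makes the "MCP-light" point concrete: context moves down a hand-built chain of hooks rather than through a shared protocol.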
40:37 Again, this may not be your use case, but this is how we've done it. Okay. I
40:42 just want to deep dive into two quick things, because I feel like there are
40:47 a few related questions. So I have a few deep-dive slides on the AI SDR, and
40:52 then a few deep dives on our AI VPM that I'll quickly touch on. And if
40:55 you guys like this content, I can do more
41:00 questions at another time, on another Wednesday that's not AI day. But yeah,
41:05 on a quick deep dive, I think there are things to keep in mind, because a lot
41:09 of you in the chat seem to be rolling out your first AI SDR. Now, I think
41:15 a few tips and tricks, agnostic of any tool that you use. I feel like this
41:20 is hopefully good advice across the board regardless of what tool you're
41:24 using. One: treat each outbound segment dynamically. And what I
41:28 mean by that is, even across the multiple agents we have for AI go-
41:33 to-market, I don't do what I see people do: one campaign for 10,000 leads.
41:40 No. I max my campaigns at like 100 to 500. I want each campaign, each
41:47 sub-agent, to be highly customized, highly trained on the exact segment it's
41:52 going after. Not a broad "hey, have you heard about SaaStr?" No. I want
41:56 to say: okay, these are my outbound segments. I put a chart here on the right
42:01 that I made for our outbound AI SDR funnel; hopefully it's helpful. But I
42:05 treat each of these dynamically, and I train each sub-agent dynamically on each
42:10 of these things, so that the output, to Jason's earlier point, is pretty good,
42:14 right? Okay, maybe it's not great, but at least it's pretty good, because
42:18 everything is tailored. The audience is hyper-segmented. The messaging is hyper-
42:22 segmented. The training is hyper-segmented. Hyper-segmentation in the age
42:27 of AI with these agents is your friend. Don't spray and pray, please.
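As a sketch, the cap-and-segment rule described here (campaigns of roughly 100 to 500 contacts, one trained sub-agent per segment) might look like this; the cap value is the range from the talk, and the segment names and helper function are hypothetical.

```python
# Sketch: split warm audiences into capped, hyper-segmented campaigns,
# one sub-agent per segment. MAX_CAMPAIGN_SIZE reflects the 100-500
# range mentioned in the talk; everything else is illustrative.
MAX_CAMPAIGN_SIZE = 500

def build_campaigns(contacts_by_segment: dict[str, list[str]]) -> list[dict]:
    campaigns = []
    for segment, contacts in contacts_by_segment.items():
        # chunk each segment so no single campaign exceeds the cap
        for i in range(0, len(contacts), MAX_CAMPAIGN_SIZE):
            campaigns.append({
                "segment": segment,  # the context the sub-agent is trained on
                "contacts": contacts[i:i + MAX_CAMPAIGN_SIZE],
            })
    return campaigns

demo = {"abandoned_trials": [f"c{n}" for n in range(1200)],
        "event_leads": [f"e{n}" for n in range(300)]}
print([(c["segment"], len(c["contacts"])) for c in build_campaigns(demo)])
```

The inverse of spray-and-pray: many small campaigns, each with a segment label that becomes training context for its sub-agent.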
42:31 Don't do that with your agents. I see a lot of people do that. That's
42:36 how you get the bad emails from the AI SDRs. Another way to
42:41 think about it, too, is to not think about it in the human ways of
42:45 segmentation, right? A lot of times, classic outbound would be: okay, I'm
42:48 going to do it on the geo of where they're based. I'm going to do it on
42:50 their title. I'm going to do it on their role. You can see on my chart, none of
42:55 that exists here. I'm not doing any of that super high-level, almost
43:00 artificial segmenting. We do it hyper-segmented. And the reason we do this is,
43:05 as I bolded it here, to give your agent context. Right? If you're already used
43:10 to using ChatGPT and Claude, what you're doing with those every day is
43:14 talking to them, giving them context, telling them about your business. That's
43:18 the same thing you have to do for these AI go-to-market SDR agents. You have to
43:23 give your agent context, and the more context you give it, the better the
43:29 result will be. And so that's why I hyper-segment everything: lists,
43:34 messaging, targeting, etc. All hyper-segmented for the AI SDRs. And that's
43:40 across all of our AI SDR agents, right? You have to give your agent context for
43:45 it to understand who you are trying to reach out to, and what their specific
43:49 pain points are that your tool can solve. And then it's going to
43:53 use classic AI: scrape the internet, see what their
43:57 company is doing, and relate it back to them. And so you'll see in my outbound
44:03 AI SDR funnel, none of this is cold leads, and none of it is geo or
44:08 title or location. And I think, too, this list, I don't know how long it is, 12 things:
44:15 for most of you, start here. Start here with your AI SDRs. Too many
44:21 folks I see now are doing AI SDRs like: I'm just going to let it loose on cold
44:24 outbound, because that's what our human SDRs don't want to do. I understand your
44:27 human SDRs don't want to do cold outbound to people who don't know you,
44:31 but neither does your AI agent, because your AI agent does not have context for
44:35 why you should be reaching out to this person. So the same rules apply here in
44:42 outbound AI SDRs: start with the hot people, the people on
44:45 your website. A lot of these AI agent tools can de-anonymize some of your
44:49 website traffic to email them. People who have inbounded to you. If you have
44:53 abandoned carts or trials, or you have event leads, start with all the hot
44:57 people. Do the people who were a customer and maybe changed
45:01 jobs. Do current customers; we get this all the time. I email
45:04 people who bought a ticket to come to SaaStr Annual in May,
45:08 and I email sponsors that are current customers to say: hey, we
45:14 added a bunch of new stuff. I think too many folks kind of skip using AI for
45:19 expansion, but it's a great way to do it. If you have recent marketing
45:22 leads, because you're doing something like a webinar like this, or you've
45:25 gotten ebooks or gated content, or you spent some money on some sponsored media
45:29 and you got some leads, put those people onto the agent. Leads we never followed
45:33 up with, that we famously gave to Agentforce. Again, the list goes on. You
45:36 can see what I mean, hopefully. There are so many hyper-segments you
45:41 can give your agent before you give it a quote-unquote cold lead that knows
45:44 nothing about you, that you should start here. And a lot of the reason you
45:49 should start here is not only will it give your agent context, it will give
45:52 your human team context on what works and what doesn't. So that by the time
45:56 you maybe exhaust this list, and I still haven't exhausted this list after 8
45:59 months, but maybe you start to dwindle down this list because you don't have as
46:04 many contacts, then you can start
46:09 to do the truly cold AI outbound to folks who maybe don't know you. But at
46:12 that point, you're using what worked. Again, this all goes back to what works,
46:16 right? At that point, you know how to train your AI agent. You kind of
46:20 know what's worked for these audiences. Then you can make a very
46:23 informed guess on what would work cold. All right, hopefully that's helpful. I
46:30 think the other quick thing, just across the board, and then I'll go into the
46:36 AI VPM that we built and try to do some quick questions: AI is
46:41 great because it can ingest everything, right? We have it
46:46 ingest the best of everything: your best case studies, your best everything.
46:53 But also tell it what you can't do. I think this is a super important nuance
46:56 that I've only learned after 8 months now. I used to just be like, okay, here's
47:01 the best of everything; stay in these boundaries.
47:07 And then over time, because the agents are so self-gratifying, it's
47:09 trying to beat itself, right? It's like: "Hey, Amelia, I did pretty good. And
47:13 so now I'm going to start to maybe either make stuff up or try to beat
47:19 myself on my opens, clicks, meetings, and I'm going to start to say things
47:24 that maybe you didn't put into the context of the agent." And so I quickly
47:27 learned, a couple months in actually, that once you start to do this at scale, it's
47:35 maybe just as important to tell your AI agents what you can't do and what you
47:41 don't do. I have now told it: we don't do that, we don't
47:45 do this, we don't do that. For example, we don't offer people a
47:49 speaking slot. Yeah, we have speakers at SaaStr, but a lot of people
47:54 apply to speak; send them to the content committee submission
47:59 instead. So I think that's just an important nuance
48:02 I've learned over time. So hopefully that's all for you guys to know now.
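One way to encode that "tell it what you can't do" nuance is a block of explicit negative constraints next to the positive context. The wording below is illustrative, not the actual SaaStr prompt.

```python
# Illustrative agent-context fragment: best-of examples plus explicit
# "we don't do this" constraints, per the guardrail described above.
# Every line here is a made-up example of the pattern, not a real prompt.
AGENT_CONTEXT = """
You are an outbound SDR agent for our events business.

What we offer:
- Conference tickets and sponsorship packages.

What we do NOT do (never offer or imply these):
- Speaking slots. Point applicants to the content committee submission form.
- Discounts, features, or programs not listed above, even if they sound
  plausible or seem likely to improve reply rates.

If a prospect asks for something not listed, say we don't offer it and
route them to the right channel instead of improvising.
"""

print(AGENT_CONTEXT)
```

The "never offer or imply" framing matters because, as described, an agent optimizing its own opens and clicks will otherwise improvise like an ambitious human SDR.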
48:07 Hopefully you're earlier in your journey, because I kind of learned it the hard way: it
48:11 sent some emails it shouldn't have, about things we didn't do, and I
48:17 realized it was because I didn't tell it that we couldn't do those things, right?
48:20 It was just ambitious, in the way that maybe a human SDR would be like:
48:24 oh, I don't know, I think we could do that. Or: oh, I think that's on the roadmap.
48:29 Classic, right? And so the AI agent did a little bit of that. And so I think
48:33 it's important now to say: okay, here's what we can do, and here's what we can't.
48:40 Okay. This is a little context, but I want to go through it. I'm going to upload
48:47 these slides for everyone, so just ask. Don't sweat it. I'll also
48:51 send them to you. We still reply to everything. Maybe just the last
48:57 tidbit on AI SDR agents: if you have bad foundations, and what I
49:02 mean by that is bad context, that's where you'll see bad emails, right? Bad context
49:07 equals bad emails. Honestly, this bad email I put on here, I actually think a
49:10 human wrote it, to be honest. It was written in a way that I don't think an
49:15 AI wrote it, but it was written in a way where they're just bad. These are
49:22 truly bad. But I have seen AI SDR emails that are of this quality.
49:27 I do think these are two human emails, because this person didn't actually know
49:30 where I worked, and I was like, AI would have gotten where I work right. So I
49:36 think a human wrote that, because that seems like a very basic mistake that an AI
49:41 would not make. So I think that one's a little funny. But
49:50 anyways, yeah. So, this is one where, oh, I guess I didn't put
49:54 the screenshot in. Sorry, I meant to add another screenshot
49:57 where they got the company that I worked for wrong, and I was like, that's not an
50:01 AI, that's a human. But I'll add it, and then I'll share the slides. This
50:04 other one, okay, again, this person wrote this. I'm
50:07 pretty sure they wrote this as a human. Maybe I'm wrong, but they did it
50:11 based on, again, things I never segment on in an AI SDR. They did it
50:16 based on geo, of where the office is for SaaStr. They did it based on my
50:22 role at SaaStr. But clearly, again, I think a human wrote this and did not look
50:25 anything up, because they wanted me to use their tool, where I
50:29 literally mentioned that tool on this call. So I was like, well, at least
50:34 reference that, or at least acknowledge that I'm already
50:38 using a different AI SDR. That would have told me that at least
50:43 you listened to something, or your LLM listened to something I did, and knew that
50:47 I used this product. But just saying: are you thinking about using...
50:50 what are your priorities for 2026? I'm like,
50:55 dude. Anyways, this is a bad email. Bad foundations, bad context
51:01 equals bad emails. But also, you know, there's still plenty. Oh, here's
51:04 the other one. Yeah, this is the other one that somebody sent me,
51:10 that, again, I think a human wrote, not an AI, because they got the
51:13 company wrong, and I think an AI would get the company right. But clearly
51:16 I'm on another webinar talking about SaaStr, and I don't work at Forrester. Never have
51:21 I ever worked at Forrester. So I think a human wrote this and
51:27 just copy-pasted. And again, not an AI, because I think 100% of the
51:33 time, or maybe 99.9% of the time, our AI agents know where you work.
51:36 Maybe if I had worked there previously, I would give it a pass,
51:40 but I never ever worked there, so I don't give that one a pass. All right, in the
51:45 last few minutes: our newest AI agent that we built, and why we built it. And
51:48 then we could do a follow-up to this, because 5 minutes will not do it
51:54 justice. We did not find a viable third-party marketing agent that could
51:59 do more than content. Right? A lot of the marketing tools out there for true
52:03 go-to-market do a lot of content-related activities. The real problem we had was
52:08 orchestration: based on data, based on already having other agents, based on
52:13 having proprietary agents, whatever. I had a need to build an
52:20 agent, and I also knew we had a track record where anytime I tried to
52:24 onboard an actual human with all this data, they got overwhelmed, right? And
52:29 so I was like: okay, what can I do now, knowing what I know now, 8 months in, to
52:34 really capitalize on getting an agent to work that could push us, keep
52:39 us on track? And ultimately, what our AI VPM does is actually tell me what
52:44 to do. >> Just like most CMOs. >> I know. >> They don't actually do the work. They
52:50 just tell everybody what to do. That's the dream job. >> That's the dream job. But the difference
52:56 with my agent is, at least it uses data to tell me what to do. Right?
53:00 >> I see. That's what they did five years ago at their last one.
53:03 >> Not just: hey, I used this playbook at my last company, I'm going to do it
53:07 here and bring in an agency and a bunch of people. At least my agent was
53:12 honest about: hey, here's the data. Here's where I think you're falling
53:14 short. Here's where you should double down. Here's where you should spend
53:17 more. Here's where you should hire a person. It literally gave me all
53:22 this output, which was quite nice. So yeah, that's a little funny. That's a good comparison.
53:29 So this is a quick slide on how we did it. I took a bunch of data from
53:34 our agents, from our third-party tools, from internal data that we've had,
53:37 not all of our data, just some, because it's a lot. So I
53:40 cherry-picked some of the best data, the data I wanted it to action
53:44 on. I looked at our Zapier workflows, I looked at
53:47 Salesforce. I took all of it and pushed it into Claude, just for purposes of this. And then
53:57 I took what I did in Claude and pushed it into Rubble, just so I
54:00 could make it into a website that the rest of the team could access. Because,
54:05 obviously, my Claude is for me, and, maybe for good reasons,
54:09 there's not good team sharing on
54:13 specific chats. And so I pushed it there so I could make websites to
54:16 share with Jason and David, and then some of our production team at SaaStr. So
54:20 we built our own. And we nicknamed it 10K, for a lot of reasons. But at the end
54:25 of the day, how I built this custom agent was: I already had something in
54:31 mind of what I wanted it to do. I had a very clear goal in mind: I
54:38 wanted to get to the first 10,000 attendees for SaaStr Annual in
54:42 May, and the first 10 million of revenue for this year. So I
54:48 gave it very clear goals when I built this agent, and I gave it only the
54:53 context and data that related to those two goals and those two things.
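That goal-first setup can be captured as a small spec: explicit goals, and only the context that serves them. The goal numbers and data sources come from the talk; the structure and field names are illustrative.

```python
# Sketch of the 10K agent's setup: clear goals first, then only the
# data sources that relate to them. Goal values are from the talk;
# the dict structure and field names are made up for illustration.
AGENT_SPEC = {
    "name": "10K",
    "goals": [
        "first 10,000 attendees for SaaStr Annual in May",
        "first $10M of revenue for the year",
    ],
    # cherry-picked context scoped to the goals, not a full data dump
    "context_sources": [
        "zapier_workflows",
        "salesforce_records",
        "year_over_year_registration_patterns",
        "recent_agent_interactions",
    ],
    "output": "six-month roadmap: high level plus daily executable tasks",
}

print(AGENT_SPEC["name"], "-", AGENT_SPEC["output"])
```

Scoping the context to two goals is the design choice that kept the agent actionable where a full data dump had overwhelmed human hires.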
55:00 And basically, the architecture was: it used a lot of Claude Opus.
55:06 I will say, I had to upgrade to Max to use it, because
55:11 my Pro account ran out of memory, and it did take me a weekend. This was
55:15 over a weekend I did this. I had to upgrade to Max, which now I love, but
55:19 at one point, when I was on the Pro plan, it was like: you're
55:23 out of memory, please wait until 3, please wait until 7. I was like, okay,
55:27 I'm just going to upgrade. I had to upgrade because I was using the LLM a
55:31 lot. But I had it analyze all these things: all the emails, the data,
55:36 what's worked year over year, the registration patterns, the time of day,
55:39 when do people buy a ticket to SaaStr Annual, when do people buy a sponsorship?
55:45 Plus some of our recent agent interactions. I
55:49 shouldn't say all; I put some of the recent agent interactions in there so it
55:53 could see how agents and people were interacting with SaaStr. I felt like that
55:57 was important context to give it as our AI VP of Marketing. And then, you
56:03 know, I told it to give me an analysis of the next six
56:07 months. Give me a roadmap of everything we should be doing. And I said: give me
56:13 high level, and then give me full details. And I'll show you that
56:17 in the next slide. I was like, I need every single marketing initiative at,
56:23 again, a high level and as an actual daily executable task. And I told it to give
56:26 me that, right? I think this is super important, so that you don't just get a
56:30 bunch of generic strategy ideas. I told it I wanted executable tasks. I
56:34 wanted them grounded in the data, and I wanted it to be easy enough to follow
56:39 that the humans here, the three plus one dog, could still execute it. And
56:45 so that's where, again, I think a lot of this was just a culmination
56:51 of our using agents: I kind of knew what I wanted, and I knew what context to
56:55 give it. I knew what data to give it. I don't recommend just building your own
57:00 AI VPM today; go to other agents first. But it's interesting, because a
57:05 lot of stuff I was doing it said to blow up or abandon, and then some stuff it
57:09 said to bring back, and then there was a bunch of new stuff it told me to do. So
57:13 again, since it was based in data, I'm trusting it on what to do and how to run
57:18 these campaigns. And so you'll see here, at a
57:21 high level, it's giving me a week-over-week game plan for cumulative tickets. So
57:25 this is important: I told it this would be cumulative to everything
57:29 else we're already doing at SaaStr. Just give me cumulative ideas of what we
57:32 can do to, you know, maybe instead of 10,000, have 12,000 or
57:37 15,000 people. Just give me cumulative ideas to get a couple
57:42 thousand more folks to SaaStr Annual. And so this was the game plan it came up
57:46 with. So you can see, in the early weeks, it's like: okay, you can do
57:49 some early-bird stuff in January, you can do some alumni stuff. And then when
57:52 you click each of these, it has literally what you should be doing: the
57:57 email, the... it knows I'm using an AI SDR, it knows some of our AI stack. Again, I
58:01 do have that context written in: this is what to do with Qualified, this is what
58:04 to do with Artisan, this is what to do with 184, this is what to tell Jason to
58:09 do on his social media. It literally gave me all of it. It told me how much
58:13 to spend on LinkedIn, what the LinkedIn ad should be. It is that granular:
58:17 you can see on the left it's high level, but then it's also super
58:22 granular. And so I think this is where it's been an important back-and-
58:28 forth between us, of just seeing what works and what can be done by an AI.
58:34 I think it's important, too: 10K, as we nicknamed
58:40 him, can do a lot. There are some things, though, it can't do, right? Because I
58:49 built this as an internal agent,
58:52 today I don't have it hooked up to these tools directly. Now, you can imagine
58:59 a world, maybe in the late half of 2026, where you're like: okay, the AI VPM is
59:04 connected, either via Zapier or something else, to LinkedIn, and it can
59:08 start to draft the ads for you, or it can start to draft the email
59:12 copy for you. Today, it's still not doing that level of
59:19 augmentation and automation. It's coming up with the ideas, and it's
59:22 tracking daily. I literally talk to 10K every day:
59:25 hey, where are we at today? What should we be doing today? Where might we be
59:29 falling behind? Because I'm a human. Now, I'm running out of time,
59:34 and so again, it's not running all of our teams for us, and we're still
59:37 doing that ourselves. But again, it gave us really good data on what to do every
59:42 day. It keeps us focused, right? The other thing I'll say is, it's not always right,
59:48 and it's not always wrong. I literally challenge it on some things, where it was like: "Oh, I
59:51 think you should run this campaign." For example, it gave me a
59:54 campaign to run for January, end of month. I was like, I don't really like
59:57 that one. It's not very urgent. To me, I would not click on that
60:01 campaign, so why would other people? And so I kind of challenged our AI VPM on
60:05 doing something else for this week. And then it agreed; it looked at the data
60:08 and looked at my points, and it was like, "No, you're right. We should change it."
60:12 So it's not always right, but it's not always wrong, right? I think it's
60:16 good, again, to have that, whatever you want to call it, human orchestration or
60:21 human in the loop, to check in with it.
60:27 I will say the biggest thing 10K has done is, one, it's
60:31 keeping me extremely organized on this one very particular vector. For me,
60:36 because I manage so many agents, and I still do a lot of goals, production
60:41 goals, it does keep me honest on what I should be doing and focusing on each
60:44 day. So for me, I like that, and I don't mind, actually, that 10K has
60:48 told me what to do. Okay, it's become more of a conversation
60:52 now, but I don't mind that it's come up with what I should be doing.
60:55 Sometimes I'm out of energy, like: you tell me, based on the data, what we
60:59 should be doing and where we should be doubling down, etc. Sometimes I'm too tired to think. So
61:06 I don't mind that. You might mind that. You might be like, I don't know if I want
61:08 that or not. >> I actually don't mind it, because again,
61:13 it's rooted in data. It keeps me honest. And we'll demo this in an AI
61:26 workshop Wednesday coming up. I think, for the future, our learning is:
61:31 listen, AI marketing tools, no matter what vendors say, are not nearly as
61:34 mature as the sales tools, which are not nearly as mature as coding or support
61:38 tools, right? They're earlier. So we had to build our whole AI to map out all of
61:42 our marketing initiatives for the year to hit our goals. And it's not ready yet
61:46 to automatically integrate with all of the other tools, although with the other tools
61:49 you saw, from Clay to Artisan, it should integrate with all of them natively one
61:54 way or the other. It doesn't yet, but that is something I think we will
61:58 explore as a community and as a group. And I think, to the earlier point, by
62:01 the second half of the year, this will all work. Instead of us having to
62:06 build our own AIVPM and it being siloed, it will all connect, and there won't be
62:10 any excuses for shooting from the hip in marketing anymore in B2B. It
62:14 will do all the work for you. >> That's what I'm excited about. But,
62:17 we'll share our journey and we'll dig into this. We'll do a whole session on
62:21 10K and what works and what doesn't in the coming weeks. >> Yep. All right. With that, yeah, we'll
62:28 do a follow-up specifically for this, cuz I know I, like, breezed through the
62:36 AISDR stuff and the AIVPM. So, we can do a follow-up by the next. I know we didn't
62:40 get to all the questions. I'm so sorry. There's a lot of good ones. Thanks for
62:44 joining. Hope this was helpful. And we will see you guys. I'm literally going
62:46 to the next session. So, I'll see you in the next session.