The Quick Guide to A/B Testing Your Outbound Campaigns

When trying to come up with your next outbound campaign, it’s all too easy to rely on your gut and intuition. Sometimes, it’s not even our own intuition we rely on; we search online for “Best sales email ever,” copy it word-for-word, then wonder why we’re not getting the same amazing results ABC Ltd had. 

The truth is, when creating an outbound campaign, you shouldn’t ask what works for everyone else. Instead, you need to find out what works for you. You’re unique (you beautiful snowflake you), and trying to copy the successful campaigns of others is no guarantee they’ll work for you. 

Fortunately, there’s a solution. A/B testing is a fantastic way to find out what exactly works best for your business, rather than relying on wishy-washy ‘gut feeling’ and ‘best practice.’ But what exactly is A/B testing? Why should you use it? What’s the best way to get started? Read on for our quick start guide to get your A/B testing up and running ASAP.

What exactly is A/B testing anyway?

A/B testing refers to comparing two different versions of something, typically with one variable changed, to see which one is better. This has made it especially popular in sales and marketing, where it’s used to test different approaches and find the best performing variation. 

Testing is nothing new in advertising; direct mail campaigns have been testing different options for over a hundred years. However, the internet has opened up a wealth of opportunity for A/B testing on a scale that would have been impossible before. Now businesses of all sizes can run A/B tests on virtually anything they can imagine.

It’s also possible to test more than just two variables, in which case you would have an A/B/C or A/B/C/D test or so on (commonly referred to as an A/B/n test).

Properly carried out, A/B testing allows you to confidently determine the best version of your outbound campaign, unique to your business. As an added bonus, you’ll learn more about your prospects. Remember, things change! What worked yesterday, even for your company, may be tomorrow’s tired and overdone strategy.

Only by regularly testing can you be sure you’re getting the best results possible.

Carrying out reliable A/B tests

While it sounds simple, there are some additional points to consider if you want to get the most from your tests. 

For example, I said that A/B testing helps you find the better version. What ‘better’ looks like for you has to be clearly defined before you start any testing. You need to know exactly what you’re trying to achieve. Say email A results in fewer sales, but prospects who respond to email B have a higher lifetime value. Which one’s the winner?

It’s also important to carry out the test properly and analyze the results correctly. That means the audience for each variation has to be randomized. For example, if your ‘gut’ told you that email A was better than B, you might send A to your best prospects, skewing the results. 
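To make that concrete, here’s a minimal sketch of a random split in Python (the post itself doesn’t prescribe any tooling, so the function and prospect names are illustrative): shuffle a copy of your prospect list, then split it down the middle, so neither variant gets a hand-picked ‘better’ audience.

```python
import random

def split_audience(prospects, seed=None):
    """Shuffle a copy of the prospect list, then split it in half,
    so each variant gets a randomized (not cherry-picked) audience."""
    shuffled = list(prospects)
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

prospects = [f"prospect_{i}@example.com" for i in range(100)]
# seed is fixed here only so the demo is repeatable
group_a, group_b = split_audience(prospects, seed=42)
```

In practice, your outreach tool may handle this split for you; the point is that assignment must be random, not based on who you think will respond.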

Want to dig deeper? For a truly reliable test, you also need to take statistical significance and confidence levels into account. That might sound complicated, but it’s not really. If I flip a coin once and it comes up heads, heads comes up 100% of the time. However, common sense tells you that you need to flip the coin more times before trusting the result. It’s the same with your A/B tests. To have confidence in the results, you need a large enough sample to be sure the result is reliable, and not a fluke or due to an unrelated, uncontrolled variation. If you’re not sure, you can use an A/B test calculator to determine how large a sample you’ll need in order to have confidence in the results.
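If you’d rather see the arithmetic than trust a calculator, here’s a rough sketch of the standard two-proportion sample-size formula in Python. The 95% confidence and 80% power z-values are hard-coded, and the reply rates are made-up examples, not benchmarks:

```python
import math

def required_sample_size(p1, p2):
    """Per-variant sample size for a two-proportion test,
    at 95% confidence (two-sided) and 80% power."""
    z_alpha = 1.96    # z-value for alpha = 0.05, two-sided
    z_beta = 0.8416   # z-value for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Example: baseline 5% reply rate, hoping to detect a lift to 8%
n = required_sample_size(0.05, 0.08)  # roughly 1,050 emails per variant
```

Notice how quickly the numbers grow: detecting a small lift on a low reply rate can require over a thousand sends per variant, which is exactly why the note below says strict significance isn’t always practical for email.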

Note: I’m a big fan of reliable A/B tests, but it’s not always practical to establish statistical significance and confidence levels, especially when it comes to email. By the time you have sent enough emails to have that statistical significance, you’re probably already onto the next campaign. That’s why you can start A/B testing your emails in Reply immediately. Just be aware that the bigger the sample size, the more reliable the results will be.

Before you get started

A/B testing is one of those subjects where it’s easy to get drawn in and spend forever constructing the perfect test. However, before you go back to uni to study data science, it’s also important to remember the whole point of the A/B test in the first place. In our case, that’s optimizing our outbound campaigns.

It’s important to start with a solid understanding of sales sequences. What kind of approaches typically work for companies like yours? While it’s no excuse for not testing, a good foundation will give you the best start to your campaign, along with an understanding of the different components of a successful campaign that you might want to test. If you aren’t familiar with outbound outreach, read some blog posts from industry leaders, as well as books on sales development, to get a high-level understanding of how to create high-performing outbound sequences.

If you feel like you have that high-level understanding, but don’t feel like an expert, that’s okay. There is no need to understand everything from A to Z before you get started. The best idea is to start your sequence as soon as possible. That way you’ll be able to start collecting more info about open rates, reply rates, interested rates, who responds better, etc. Having that data is crucial for success.

A/B testing, step by step

Hopefully, the idea of A/B testing doesn’t seem too daunting now. By breaking it down into the following individual steps, you can quickly get started.

Decide on your goal

As we’ve mentioned, it’s not enough to go into your A/B test with nothing more than curiosity. You need a clearly specified goal; otherwise, it’s all too easy to get caught up in the different metrics available. So for an outbound email campaign, you might want to start with positive reply rate, meetings booked, or a similar outcome-based metric.

Pick a variable to test

With your desired outcome in mind, it’s time to choose what you’re going to test. It’s usually best to pick just one variable to change (multivariate testing is possible, but generally best done when you’ve already experimented with changing single variables and are ready for more complex tests). At this point I recommend coming up with a hypothesis, connecting your variable and goal to help target your testing (e.g. “I predict that this variable change will encourage (reaction) and cause (result)”).

Here are some examples of variables you might want to consider testing.

  1. Subject lines – Subject lines have a lot of work to do in a short amount of space. If they aren’t good enough, your prospect may never even see your email. Does a question outperform a statement? Do emojis hurt or help your open rate? Should you personalize with the recipient’s name or their company’s name? 
  2. Calls to action – It’s generally accepted that you shouldn’t go for the hard sell in the first email (although feel free to test that). If that’s true, what should you ask people to do after reading your email? Booking a meeting might be your campaign objective, but maybe aiming for a non-committal response and starting a conversation in your first email is more effective. Think about the next steps your prospects could take, then test them out. 
  3. Images – Another accepted practice is sending your emails in plain text, as that’s more in line with how people normally email each other (and is less likely to be picked up by an over-zealous spam filter). However, depending on your product or service, you still might want to consider sending images. We’ve been excited about our recent Vidyard integration, so you might want to test the response to videos in your email. Even if you don’t want to send any other images, many find including a picture in their signature can help increase reply rates. Will it work for you? 
  4. Body text – Of course, it’s important to think about the body of the email itself. How are you opening your email? Are you highlighting the pain points your prospect typically faces or promoting the value your solution offers? Are you going to use humor or do you want to present yourself as completely professional? Bonus point: Your email’s first lines are critical, as they’re often seen as a preview in the email client, so they are a good place to start your testing. 
  5. Sender name – People often focus on the subject line as the key variable affecting open rates, but the sender name can be just as important. ‘no-reply@company.com’ is a terrible sender name, and not worth testing. However, that still leaves several options. Is it better to use your personal name or your company name? What about using both?  
  6. Time and day of send – One of the most popular questions I’ve had is ‘what’s the best time to send an email?’ A search online will bring up as many possibilities as you can imagine, for every conceivable combination of day and time. At Reply, we’ve seen the most success sending emails on a Tuesday or Wednesday. However, your results will most likely vary. Test it for yourself. 

Note: While we’ve focused on emails for this post, outbound campaigns can also include other methods such as cold calls and engagement over social media. While this may take a bit more thought, it’s entirely possible to use the same principles to test these methods. For example, you could A/B test the opening lines of your cold call script (is it better to start with an introduction, a question, or a pleasantry?) or you could test variations of your LinkedIn connection request.

Remember to keep the test as controlled as possible and consider the additional variables these methods have; a cold call by Happy Harry may always outperform one by Blunt Barry, no matter how they start the call, and a connection request from an active LinkedIn account that’s previously engaged with the prospect may always outperform one by an inactive account, regardless of the connection request.

Create the control and challenger variations

Create your A and B test pieces, in line with the best practices for outbound campaigns. Remember, you should only change the one variable; otherwise, you won’t be sure what caused the change in response. If you’re using Reply for your outbound campaign, it’s easy to create and send your variations, no calculator required! 

Review the results

After you’ve completed the test, it’s time to see how the variations performed. Assuming the results are statistically significant, you will hopefully have a clear winner. Even if you don’t have a winner though, you should be able to learn something from the results. Was your prediction correct? If so, how can you use that in your future emails? If not, why might that be? As long as you’re learning something from the test, that’s the biggest win.
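For those who want to check significance themselves rather than eyeball it, here’s a sketch of a simple two-proportion z-test on reply counts. The function name and the reply figures are invented for illustration, not taken from a real campaign:

```python
import math

def reply_rate_z_test(replies_a, sent_a, replies_b, sent_b):
    """Two-proportion z-test: is variant B's reply rate
    significantly different from variant A's?"""
    p_a = replies_a / sent_a
    p_b = replies_b / sent_b
    p_pool = (replies_a + replies_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se  # |z| > 1.96 is significant at the 95% level

# Example: 40 replies from 1,000 sends for A vs. 65 from 1,000 for B
z = reply_rate_z_test(40, 1000, 65, 1000)  # z is about 2.5, so significant
```

If |z| falls below 1.96, treat the result as inconclusive rather than a win for either variant, and either keep collecting data or move on to testing a different variable.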

Start all over again

A/B testing should be ongoing. Even when you’ve discovered the ‘perfect’ email, with every conceivable potential variable explored, times change. What worked for you today may fail tomorrow. Keep testing, keep learning, and keep improving.


My first experience with A/B testing at Reply

It’s one thing reading about A/B testing, but how does all of this come together in real life? To help you with your efforts, here’s exactly how I started A/B testing at Reply.

When I first started, I had just 10 days to come up with my first sequence. At the time I didn’t have nearly enough information to come up with a winning subject line, messaging, pitch, or CTA. To improve my chances of finding the winning combination, I created 3-4 different variations for each step, essentially creating an A/B/C/D test.

So for the first email, I tested four different subject lines, but with the same email template.

In a separate test, I also tried different CTAs in the end with the same email template.

After that, I analyzed my results to see which variations worked better: which subject line was the best, what email template performed the best, what CTA generated the most positive responses, what follow-ups generated a higher reply rate, etc.

With the first wave of A/B testing completed, it was time to start thinking about the next one. I took the best performing ideas from the first iteration, then brainstormed some new ones. Combining these created our second sequence.

As you can see, this second iteration uses the best follow-ups from the first iteration. Now it was possible to measure whether the reply rate changed based on the CTA or the time intervals between emails (e.g. should I send my second follow-up after three days or five?).

From here I created three further iterations (five iterations in total) to create an optimized sequence that generates great results for us. Here’s that final, optimized sequence…

And here are the stats from that sequence:

Looking back, we learned a lot in the process and were able to identify several key reasons why the sequence worked out for us.

  1. We discovered the ‘{Company Name} <> Reply.io’ subject line generated the highest possible open rate.
  2. We discovered we achieved the best results by sending follow-ups in the same email thread, rather than sending each new follow-up as a separate email with a new subject line. That way, our recipients can easily scroll back to our initial email with our pitch and original CTA.
  3. Following on from that, we discovered there’s no need to pitch our product in every single follow-up. We used the first email as a pitch which correlates with their business, then used the follow-ups (in the same email thread) to remind them about that pitch.
  4. We discovered that sending the second email after four days (instead of three) generated the highest reply rates. From there, sending the third and fourth follow-ups within seven days produced better results than sending them within five days. By respecting prospects’ silence and not flooding their inboxes, we received more replies.

Of course, we’ve continued to create more iterations, not content with our ‘perfect’ sequence. We are regularly testing out new subject lines, pitches, CTAs, and ideas. However, we still have the same framework and foundation generating positive results, allowing us to create campaigns that consistently perform well.

Creating amazing outbound campaigns is easy with Reply. A/B testing can be implemented with a single click, helping you get the best results. Try it out today with a fully-featured 14-day free trial.
