Strategies for Acing A/B Tests

    A/B tests are one of the most effective ways to maximize your marketing dollars. The learnings help you keep marketing budgets cost-effective while driving better results.

    In basic terms, A/B testing means taking two different creative options and presenting them to two groups, a control group (A) and a test group (B), to determine which option moves you closer to your marketing goals. You can use A/B strategies for something as small (but important) as testing subject lines, or for bigger efforts like testing out entirely new rebrand designs. Here are steps you can take to ace your A/B tests.

    A/B testing your ads lets you run campaigns with different copy, calls to action (CTAs), and design options. With the analytics on ad response, you can remove the lower performers in favor of the better-performing ads.

    You can also create derivatives of higher-performing ads for future campaigns. Most ad platforms are set up to support A/B testing (for example, Google Ads lets you create multiple ad variations and compare their performance).

    To test ads, it’s important to determine what you want to test first. While this may seem obvious, it’s not uncommon for businesses to put up two different ads without fully scoping out what learnings they want to obtain. 

    For example, testing out different photography options with the same CTA in ads may tell you that customers click the ad with photo A more often than the ad with photo B. Still, if what you really want to do is test CTA effectiveness, the photo preferences don’t give you the learnings you want about the impact of your CTA on conversions. 

    The same principle applies when trying to test too many variables at once—you may learn that A has a higher click-through rate (CTR) and more conversions than B, but you won’t be able to tell why if each ad had 4-5 different variables (see split testing below for options with multiple variables).

    In brief, A/B tests work best when you have a clear testing goal in mind and can isolate the variable you’re testing in each campaign.
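
    To make “better performing” more than a judgment call, you can run a simple significance check on the results. The Python sketch below is a generic illustration with made-up click and conversion counts (not tied to any ad platform), comparing two ads with a two-proportion z-test.

        # A minimal sketch with placeholder numbers: is ad B really converting
        # better than ad A, or could the difference be random noise?
        from math import sqrt
        from statistics import NormalDist

        clicks_a, conversions_a = 1200, 96    # hypothetical control (A) results
        clicks_b, conversions_b = 1180, 130   # hypothetical test (B) results

        rate_a = conversions_a / clicks_a
        rate_b = conversions_b / clicks_b

        # Pooled rate and standard error under the "no difference" assumption
        pooled = (conversions_a + conversions_b) / (clicks_a + clicks_b)
        std_err = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))

        z = (rate_b - rate_a) / std_err
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

        print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
        # A small p-value (commonly below 0.05) suggests the gap is unlikely to be
        # chance; otherwise, keep the test running before declaring a winner.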

    A side note—depending on your ideal client profile, you can also consider testing ads on different platforms. For example, a university running ads to attract prospective students may want to test parent-focused ads on Facebook and student-focused ads on TikTok, using the same best practices of A/B testing above.

    Another common mistake to avoid—testing ads without also testing the landing page or web page the ad leads to. For example, if CTRs are high but you’re not getting conversions, it may be the landing page that needs tweaking, not the ad itself. 

    You can experiment with different elements to drive people through the funnel, including: 

    • Reducing the number of fields in a response form 
    • Moving the CTA higher on a landing page, or incorporating it in multiple places 
    • Testing the style of the CTA element (e.g., does it stand out enough from other design elements?) 
    • Editing copy—removing extra fluff and making sure the benefits of converting are clear

    Beyond ads, you can test your website at a broader scale with split testing. Split testing lets you try out completely different campaigns at once, or test dramatically different new designs, such as complete rebrands.

    • Split URLs drive people to web pages only visible to that particular group. You can analyze results such as bounce rate, time on page, inquiries, and more, to determine response to the new marketing elements. 
    • Split testing gives you the ability to refine new design ideas as you go. You can funnel a certain percentage of people to the new page or site experience, make changes accordingly, and keep making tweaks until you’re seeing the response you want (a simple way to split traffic consistently is sketched after this list). At that point you can fully replace the former pages/site with the new one(s).
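
    One common way to implement that kind of percentage funneling is to hash an anonymous visitor ID into a bucket, so each person consistently sees the same version on repeat visits. The Python sketch below is an assumed, simplified illustration (the visitor ID, experiment name, and 20% share are hypothetical), not a feature of any specific platform.

        # A minimal sketch: deterministically route a share of visitors to the new page.
        import hashlib

        NEW_PAGE_SHARE = 0.20  # hypothetical: send 20% of traffic to the redesign

        def assign_variant(visitor_id: str, experiment: str = "rebrand-test") -> str:
            """Return 'new' or 'current' for a visitor, stable across repeat visits."""
            # Hash the visitor ID together with the experiment name so the same
            # person always gets the same page, and separate tests split independently.
            digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
            bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash into [0, 1]
            return "new" if bucket < NEW_PAGE_SHARE else "current"

        # Example: an anonymous cookie ID decides which page the visitor lands on
        print(assign_variant("visitor-12345"))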

    A/B testing is a useful approach for both cold outreach emails and emails/newsletters to existing customers or subscribers. 

    • Testing two subject lines to compare open rates is one of the simplest but most effective ways to improve email communications (see the sketch after this list for a simple way to split your send).
    • You can test types of content campaigns—such as sending out two emails on the same topic, one that has a longer narrative form, and one that’s very short and broken up into bullets. 
    • While Classic City often recommends plain text CTAs in emails, you can test the wording and stylization of the plain text CTA (even testing it bold vs. not bold). If the email is more stylized or in HTML format, you have additional design options to play with and test. 
    • You can A/B test or even split test newsletter designs to gauge response to design updates.
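
    For the subject-line test above, the split itself can be as simple as randomly halving your list and sending one subject line to each half. The Python sketch below is a generic illustration with placeholder addresses and open counts; it assumes your email platform reports opens per send.

        # A minimal sketch: randomly split a subscriber list for a subject-line test.
        import random

        subscribers = [f"person{i}@example.com" for i in range(2000)]  # placeholder list

        random.seed(42)               # fixed seed so the split is reproducible
        random.shuffle(subscribers)

        midpoint = len(subscribers) // 2
        group_a = subscribers[:midpoint]   # gets subject line A
        group_b = subscribers[midpoint:]   # gets subject line B

        # After the send, pull open counts from your email platform and compare rates
        opens_a, opens_b = 412, 498        # placeholder open counts
        print(f"Subject A open rate: {opens_a / len(group_a):.1%}")
        print(f"Subject B open rate: {opens_b / len(group_b):.1%}")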

    This is an overview of ways to ace A/B tests, but there are many more considerations and testing opportunities out there. If you would like to learn more about testing the effectiveness of your marketing dollars and what that could look like for your organization, schedule a call today.


    👇🏻

    If you’re ready to connect with us and learn more about Classic City, there are three ways to make that happen:

    Add depth to your marketing team here.

    Learn about our website-building process here.

    Ask us a general question here.