Running A/B Test Email Campaigns in the CDP

Optimize your email send-outs by testing variations and automatically sending the best-performing variant

A/B testing is a powerful method for fine-tuning your email campaigns. With the CDP’s built-in A/B testing functionality, you can test different subject lines, templates, sender names, and more—then automatically send the winning variant to your audience based on performance criteria like open rate or click-through rate.

Whether you’re trying to increase ticket sales, improve engagement, or test messaging before a big event, A/B testing helps you make data-driven decisions directly from your campaign configuration flow.


Benefits of A/B Testing in the CDP

  • Data-driven optimization: Make strategic decisions based on actual supporter behavior

  • Automated winner selection: Send the best-performing email to the rest of your audience

  • Flexible testing options: Test up to 20 variants if needed (although 2–3 is typically enough)

  • Time control: Define when and how the winning variant is selected and sent


🛠️ Step-by-Step: Setting Up an A/B Test Campaign

1. Enable A/B Testing

On the email campaign configuration page, click the "A/B Test" toggle or button to activate A/B testing for your campaign.

2. Define Basic Settings & Target Audience

Provide the following details:

  • Test Name

  • Test Description

  • Labels (e.g., Ticketing, Sponsor Campaign)

  • Select one or more segments to include or exclude

  • Optionally include or exclude individual email addresses

3. Create Variants (A, B, C… Up to 20)

  • Define the number of variants and the percentage split per variant (e.g., A: 10%, B: 10%, Winning: 80%); a worked example of this split follows at the end of this step

  • Customize each variant with:

    • Subject line

    • Preheader

    • UTM campaign name

    • From name, from email, reply-to email

    • Email template (you can define the template on the first variant and reuse it for the remaining variants)

💡 Tip: Keep most settings consistent and change only one or two elements between variants for a clean test (e.g., just the subject line).
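
To make the split arithmetic concrete, here is a minimal sketch of how an audience might be divided across variants. The numbers, the `split_audience` helper, and the variant labels are hypothetical; the CDP performs this allocation for you once you set the percentages.

```python
import math

def split_audience(total_profiles: int, splits: dict[str, float]) -> dict[str, int]:
    """Allocate an audience across test variants by percentage share."""
    assert abs(sum(splits.values()) - 100) < 1e-9, "splits must sum to 100%"
    counts = {name: math.floor(total_profiles * pct / 100) for name, pct in splits.items()}
    # Hand any rounding remainder to the winning pool so no profile is dropped.
    counts["Winning"] = counts.get("Winning", 0) + total_profiles - sum(counts.values())
    return counts

# Hypothetical audience of 50,000 profiles with the A: 10%, B: 10%, Winning: 80% split:
print(split_audience(50_000, {"A": 10, "B": 10, "Winning": 80}))
# -> {'A': 5000, 'B': 5000, 'Winning': 40000}
```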

4. Configure Winning Variant Settings

  • Select winning criteria:

    • Open rate

    • Click rate

    • Unique opens/clicks

    • Total opens/clicks

  • Decide how and when to select the winner (the selection logic is sketched after this step):

    • Delay-based: e.g., wait 8 hours or 1 day after sending

    • Scheduled send: define a fixed time to evaluate and send the winner (great for fixed match/event timings)

🎯 Optional: If you don’t want the winning variant sent automatically, you can remove the final delivery step and assess the results manually.
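
Once the evaluation window closes, the selection step boils down to ranking the test variants on your chosen criterion. The sketch below illustrates that logic with made-up stats; the `pick_winner` function and field names are illustrative, not the CDP’s actual API.

```python
def pick_winner(variants: dict[str, dict[str, int]], criterion: str) -> str:
    """Return the variant label that scores best on the chosen criterion."""
    def score(stats: dict[str, int]) -> float:
        # Rate criteria are normalized by delivered volume; count criteria
        # (e.g., "unique_opens") compare raw totals.
        if criterion == "open_rate":
            return stats["unique_opens"] / stats["delivered"]
        if criterion == "click_rate":
            return stats["unique_clicks"] / stats["delivered"]
        return stats[criterion]

    return max(variants, key=lambda name: score(variants[name]))

# Hypothetical stats collected 8 hours after the test send:
stats = {
    "A": {"delivered": 5000, "unique_opens": 1900, "unique_clicks": 240},
    "B": {"delivered": 5000, "unique_opens": 1650, "unique_clicks": 310},
}
print(pick_winner(stats, "open_rate"))   # -> 'A'
print(pick_winner(stats, "click_rate"))  # -> 'B'
```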

5. Review Summary Before Sending

Before sending or scheduling, the platform will show:

  • Number of profiles in each variant

  • Total number of emails to be sent

  • Unsubscribed emails

6. Send or Schedule

  • Send immediately or schedule for a later time


💡 Best Practices for A/B Testing

  • Test strategically: Focus on one variable (subject line, CTA, sender) per test:
    • If you optimize for open rate, work on the subject line variants
    • If you optimize for click rate, work on the template layout and calls to action

  • Allow enough time: Give the system a fair evaluation window, a minimum of 4–6 hours, before selecting a winner

  • Match timing to real-world needs: Use scheduled winning send-outs for events with strict deadlines

  • Start small: 2–3 variants are usually enough to draw meaningful insights (a quick significance check, sketched below, can tell you whether a lead is real)
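
The CDP picks the winner on raw performance, but if you want to sanity-check that a variant’s lead is more than noise before reusing the insight in future campaigns, a two-proportion z-test is a common rule of thumb. Everything below, including the helper name and the example numbers, is illustrative and not part of the product.

```python
import math

def two_proportion_z(opens_a: int, sent_a: int, opens_b: int, sent_b: int) -> float:
    """Z-score for the difference between two open (or click) rates."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Hypothetical results: |z| > 1.96 roughly corresponds to 95% confidence.
z = two_proportion_z(1900, 5000, 1650, 5000)
print(f"z = {z:.2f} -> {'significant' if abs(z) > 1.96 else 'inconclusive'}")
```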


📊 After the Send: Analytics

All sent variants—including the winning version—will appear in your Email Campaign List View, with stats on opens, clicks, and deliveries. Use these insights to continuously improve future campaigns.
