A/B Testing in the Ads Manager UI

Starting January 2023, we’re launching access to self-serve A/B testing in the X Ads Manager UI. 

A/B testing is an experimentation framework that allows you to test performance hypotheses by creating randomized, mutually exclusive audiences that are exposed to a particular test variable.
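To illustrate how mutually exclusive, randomized buckets work in general (this is a conceptual sketch, not necessarily how X implements assignment), the following Python example deterministically hashes each user into exactly one bucket per test:

```python
import hashlib

def assign_bucket(user_id: str, test_id: str, num_buckets: int) -> int:
    """Deterministically map a user to one of `num_buckets` mutually
    exclusive buckets by hashing the (test, user) pair. The same user
    always lands in the same bucket for a given test, and no user can
    appear in two buckets at once."""
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % num_buckets

# Example: split an audience across 3 creative variants.
print(assign_bucket(user_id="12345", test_id="creative-test-1", num_buckets=3))
```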

We offer self-serve access to our A/B testing tool to enable our partners to gather learnings and fine-tune their campaign strategies with statistically significant data.

We support A/B testing of creative assets only:

  • Images
  • Videos
  • Text
  • CTAs

Important: A/B testing is also available to marketing partners via the Ads API. Please refer to this resource for more information. 
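For API-based setups, a minimal sketch follows, assuming OAuth 1.0a credentials (which the Ads API uses). The host, API version, and field names below, including the flag that enables A/B testing, are placeholders; consult the Ads API reference for the actual endpoint and parameters:

```python
# A minimal sketch, not a working integration: endpoint and field names
# are placeholders -- see the Ads API docs for the real values.
from requests_oauthlib import OAuth1Session  # Ads API uses OAuth 1.0a

session = OAuth1Session(
    "CONSUMER_KEY", "CONSUMER_SECRET",
    "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
)

account_id = "abc1"  # placeholder ads account ID

resp = session.post(
    f"https://ads-api.x.com/12/accounts/{account_id}/campaigns",  # placeholder host/version
    data={
        "name": "Pet store creative A/B test",
        "funding_instrument_id": "def2",  # placeholder
        "ab_test_enabled": "true",        # hypothetical flag; see the Ads API docs
    },
)
print(resp.status_code, resp.json())
```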

 

Use cases

A/B testing is most often used to support: 

  1. Optimization use cases for performance-focused advertisers who want to understand what works best on X in order to optimize their campaigns.

  2. Learning use cases for branding-focused advertisers who want to use learnings to inform their creative strategy.

Example:

  • I run a pet store and want to understand which creative drives the best click-through rate (CTR) for my brand. My creative assets feature dogs, cats, and rabbits.

    • Split Testing Variable: Creative

    • Ad Groups: 3 (with one ad group ID per group).
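Expressed as data, this test plan might look like the sketch below. The class and field names are illustrative, not an Ads Manager schema:

```python
from dataclasses import dataclass

@dataclass
class TestCell:
    name: str
    ad_group_id: str  # one ad group ID per cell
    creative: str     # the single variable under test

# The pet store test: one variable (creative), three mutually exclusive cells.
cells = [
    TestCell("dogs",    ad_group_id="ag_001", creative="dog_video.mp4"),
    TestCell("cats",    ad_group_id="ag_002", creative="cat_video.mp4"),
    TestCell("rabbits", ad_group_id="ag_003", creative="rabbit_video.mp4"),
]

# Cells must not share ad groups -- each bucket is mutually exclusive.
assert len({c.ad_group_id for c in cells}) == len(cells)
```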

 

Supported functionality

Objectives

  • All objectives are supported, except Takeovers. Dynamic Product Ads are not supported if A/B Testing is toggled on.

Variables available for testing

  • Creative 

Note: One A/B test can have up to 5 mutually exclusive buckets (i.e., individual ad groups per A/B test).
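A quick pre-flight check against this limit might look like the following. The minimum of two cells is an assumption, since a comparison needs at least two variants:

```python
MAX_CELLS = 5  # up to 5 mutually exclusive buckets per A/B test

def validate_cell_count(num_cells: int) -> None:
    # Assumes at least 2 cells, since a comparison needs two variants.
    if not 2 <= num_cells <= MAX_CELLS:
        raise ValueError(f"An A/B test needs 2-{MAX_CELLS} cells, got {num_cells}")

validate_cell_count(3)  # the pet store example: dogs, cats, rabbits
```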

Budgets

Reporting

  • You have access to the same media metrics and conversion data normally available in Ads Manager.

  • Additionally, you can access A/B test-specific information like:

    • Winning test cell

    • Volume of events driven

    • Cost per event

    • Win probability if the test were run again
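To make these metrics concrete, here is a small Python sketch, with illustrative numbers only, that computes cost per event for each cell and picks the winner by lowest cost, per the rule described under Interpreting results:

```python
from dataclasses import dataclass

@dataclass
class CellResult:
    name: str
    events: int   # volume of success-metric events (e.g., link clicks)
    spend: float  # spend in account currency

    @property
    def cost_per_event(self) -> float:
        return self.spend / self.events if self.events else float("inf")

# Illustrative numbers only.
results = [
    CellResult("dogs", events=480, spend=1200.0),
    CellResult("cats", events=610, spend=1150.0),
    CellResult("rabbits", events=390, spend=1210.0),
]

# The winning cell is the one with the lowest cost per selected metric.
winner = min(results, key=lambda r: r.cost_per_event)
for r in results:
    print(f"{r.name}: {r.events} events at {r.cost_per_event:.2f} per event")
print(f"Winner: {winner.name}")
```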

You can access this information in Ads Manager; see the Reporting section below for step-by-step instructions.

Best practices

A/B tests should be executed under an ‘all else equal’ principle. This means that if the goal of the test is to compare creative variables across ad groups, the audiences and other setup parameters should mirror each other (‘all else equal’) to provide clean, viable learnings.

It’s important to note that changing one or more non-test variables (e.g., setting different age targeting across the ad groups in an A/B test) would invalidate the A/B test results due to the variation across audiences.

When setting up your A/B test, Ads Manager will display warnings denoting the inclusion of multiple variables between cells (though it will not prevent this setup).
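One way to sanity-check the ‘all else equal’ principle before launch is to diff each cell’s setup against the first cell, ignoring only the variable under test. The field names below are illustrative, not Ads Manager’s actual schema:

```python
TEST_VARIABLE = "creative"  # the one field that is allowed to differ

def check_all_else_equal(cells: list[dict]) -> list[str]:
    """Return the names of non-test fields that differ between cells."""
    baseline = {k: v for k, v in cells[0].items() if k != TEST_VARIABLE}
    dirty = set()
    for cell in cells[1:]:
        for key, value in baseline.items():
            if cell.get(key) != value:
                dirty.add(key)
    return sorted(dirty)

cells = [
    {"creative": "dog_video.mp4", "age_range": (18, 49), "geo": "US"},
    {"creative": "cat_video.mp4", "age_range": (18, 34), "geo": "US"},  # age differs!
]
print(check_all_else_equal(cells))  # ['age_range'] -> would invalidate the test
```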

 

Setup

To use the new self-serve A/B testing tool in the X Ads UI…

  • Navigate to ads.x.com

  • In the top right corner, select “Create Campaign”

  • Select your desired Objective

    • Click “Next”

  • In the Campaign form, under “Campaign details”, toggle “A/B test” to ‘ON’

    • Click “Next”

  • Input relevant ad group information under “Ad group details”

    • Click “Next”

  • Create a minimum of 2 ads. These will be the variants tested in your A/B test.

    • Click “Next”

  • Review campaign details

    • Click “Launch campaign”

 

Reporting

To review the outcome of your A/B test:

1. Navigate to Ads Manager.

2. Find the campaign in which you ran your A/B test, then select the icon next to the campaign.

3. Select “A/B Test Data”.

4. Select your success metric.

5. The winning cell from your split test will be labeled.

Interpreting Results

The cell with the lowest cost per selected metric will be labeled as the winning cell. 

The win chance metric associated with a losing cell indicates the likelihood that the cell would win against the winning cell by the end of the A/B test. Ideally, you want the win chance of each losing cell to be as low as possible, giving you the highest degree of confidence that the results of the test are not due to chance alone.
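X does not document exactly how win chance is computed. One common way to estimate such a probability, shown here purely as an illustration (and for a rate metric rather than a cost metric), is a Bayesian Monte Carlo comparison of two cells’ conversion rates:

```python
import random

def win_chance(events_a: int, trials_a: int,
               events_b: int, trials_b: int,
               samples: int = 100_000) -> float:
    """Estimate P(rate_A > rate_B) by sampling Beta posteriors
    (uniform prior) over each cell's true event rate."""
    wins = 0
    for _ in range(samples):
        a = random.betavariate(1 + events_a, 1 + trials_a - events_a)
        b = random.betavariate(1 + events_b, 1 + trials_b - events_b)
        wins += a > b
    return wins / samples

# Illustrative counts: a losing cell vs. the winning cell.
p = win_chance(events_a=190, trials_a=10_000, events_b=230, trials_b=10_000)
print(f"Chance the losing cell would beat the winner: {p:.1%}")
```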

 
