> Quick answer: Google Ads has three A/B testing methods: Custom Experiments for campaign-level settings, Ad Variations for copy changes across your account, and Performance Max asset tests for asset group comparisons. Run every test for at least 4-6 weeks to get reliable data.
## What Is A/B Testing in Google Ads?
A/B testing in Google Ads runs two versions of a campaign, ad, or asset at the same time. One version is the control. The other is the treatment. You compare results and keep what wins.
Google's native testing tools make this structured. You don't have to guess which change moved the needle.
## How Google Ads A/B Testing Works
Google Ads offers three distinct testing methods. Each targets a different layer of your campaigns.
### Custom Experiments: Testing Campaign Settings
Custom Experiments split traffic and budget between your original campaign and a test version. Per Google's Ads Help Center, they work on Search, Display, Video, and Hotel campaigns. Shopping and App campaigns are not supported.
A 50/50 traffic split is recommended for the clearest comparison. Only one experiment can run per campaign at a time. You can schedule up to five tests per campaign overall.
The original campaign keeps running throughout the test. Your live performance doesn't drop while the experiment is active. When a test wins, you apply its settings to the original campaign or convert it into a new one.
One important note: changes you make to the original campaign during a test do not sync to the experiment. Keep both sides clean for valid results.
### Ad Variations: Testing Ad Copy
Ad Variations let you test a single copy change across multiple campaigns or your entire account. Per Google's Ads Help Center, they use find-and-replace or text update methods on responsive search ads only.
This method is built for focused changes. Swap one headline. Change one CTA. See what drives more clicks at scale.
Traffic splits are cookie-based by default. Each user sees only one version. You set the percentage of traffic going to the variation.
### Performance Max Asset Testing
Performance Max campaigns support asset group A/B tests. You split assets into a control group and a treatment group within the same asset group. During the test, those asset groups are locked. You cannot add, edit, or remove assets until the experiment ends.
Per Google's Ads Help Center, this feature is currently in beta; check there for the latest updates on availability.
Google recommends running Performance Max asset experiments for at least 4-6 weeks. Shorter windows rarely reach statistical significance.
## Key Concepts: Control, Treatment, and Traffic Split
The control is your original version. The treatment is the version you changed. A clean test changes only one thing at a time.
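Google Ads reports a winner for you, but the underlying idea is a simple statistical comparison: is the treatment's conversion rate really different from the control's, or is the gap just noise? A minimal two-proportion z-test sketch in Python (the conversion counts are made up for illustration):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the treatment's rate really different from the control's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via normal CDF
    return z, p_value

# Control: 120 conversions from 5,000 clicks. Treatment: 150 from 5,000.
z, p_value = two_proportion_z_test(120, 5000, 150, 5000)
```

With these numbers the p-value lands just above 0.05, so what looks like a 25% relative lift is still not quite conclusive. That is exactly why letting tests run longer matters.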
Traffic split determines what share of users sees each version. Cookie-based splits are recommended. They make sure each user sees only one version across sessions. Search-based splits can reach significance faster but are less consistent at the user level.
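A cookie-based split is essentially a deterministic bucketing function: hash a stable user identifier together with the experiment, and the same user always lands on the same side. This is a hypothetical sketch of the idea, not Google's implementation:

```python
import hashlib

def assign_variant(user_id: str, experiment_id: str, treatment_share: float = 0.5) -> str:
    """Deterministically bucket a user so they see one version across sessions."""
    digest = hashlib.sha256(f"{experiment_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # uniform value in [0, 1]
    return "treatment" if bucket < treatment_share else "control"

# The same user gets the same answer on every visit.
variants = [assign_variant(f"user-{i}", "headline-test") for i in range(10_000)]
```

Because the hash mixes in the experiment ID, the same user can fall into different buckets in different experiments, which keeps tests independent of each other.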
For audience-based experiments, Google recommends at least 10,000 users in the audience list for accurate results.
## When to Use Custom Experiments vs. Ad Variations
Use Custom Experiments when you want to test campaign-level variables on a single campaign. Bidding strategies, targeting settings, landing page differences. One campaign, deep variables.
Use Ad Variations when you want to test one copy change across many campaigns at once. A single headline swap across ten campaigns is fast and broad.
| Goal | Best Tool |
|---|---|
| Test a new bidding strategy | Custom Experiment |
| Test a headline change account-wide | Ad Variation |
| Test asset combinations in PMax | Performance Max asset test |
## Best Practices for Running Tests
Run tests for at least 4-6 weeks. Shorter tests produce unreliable data.
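The 4-6 week guideline is really a sample-size problem: small lifts need a lot of users before the difference is trustworthy. A back-of-the-envelope estimate at 95% confidence and 80% power (the rates below are illustrative, not Google's numbers):

```python
from math import ceil

def users_per_arm(base_rate, relative_lift, z_alpha=1.96, z_power=0.84):
    """Approximate users needed per version to reliably detect the lift."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# A 2% conversion rate and a hoped-for 20% relative lift:
n = users_per_arm(0.02, 0.20)   # roughly 21,000 users per version
```

At 1,000 clicks a day on a 50/50 split, that is about six weeks of data, which lines up with the recommended runtime.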
Change one variable at a time. Multiple changes in one test make it impossible to know what actually drove the result.
Let the original campaign keep running. Google's experiment framework is built for this. Do not pause the base campaign mid-test.
Act fast after a test concludes. Apply winning settings or copy immediately. Delayed action wastes the learning.
Keep a test log. Note what you tested, when you ran it, and what won. You can run up to five tests per campaign but only one at a time.
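A test log does not need special tooling; an append-only CSV is enough. A minimal sketch, with field names of our own choosing:

```python
import csv
from datetime import date

FIELDS = ["campaign", "variable", "start", "end", "winner", "notes"]

def log_test(path, **entry):
    """Append one finished test to the log, writing a header on first use."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # empty file: write the header row first
            writer.writeheader()
        writer.writerow(entry)

log_test("ads_test_log.csv",
         campaign="Brand Search", variable="headline 1",
         start=date(2024, 3, 1), end=date(2024, 4, 12),
         winner="treatment", notes="+9% CTR, applied to original")
```

One row per concluded test keeps the five-tests-per-campaign history easy to review before you schedule the next one.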
## How Coinis Supports Your Google Ads Testing Strategy
Coinis does not publish to Google Ads directly today. That's on the roadmap.
What Coinis does right now: it builds the creative and copy assets you need to run better tests inside Google's native tools.
Start with Ad Intelligence. Research what's working in your space before you build a test. Competitor ad copy, formats, and creative angles are all visible before you write a single word.
Then use Revise to generate test variations fast. AI Rewrite turns your existing ad copy into fresh angles for new variations. Variate creates multiple visual versions of one creative. You build a full test set without a design team.
Brand Profile keeps every variation on-brand. Headlines, body copy, and visuals all pull from the same brand context. Your test assets look like they belong together.
Export your assets and load them into Google Ads as Ad Variations or Performance Max assets. The testing happens in Google. The creative work happens in Coinis.
Or let Coinis do it.
From a product URL to a live Meta campaign. AI-generated creatives. On-brand copy. Direct publish to Facebook and Instagram. Real performance reporting. All in one platform.
Start free. Upgrade when you're ready.
15 AI tokens a month. No credit card.
## Frequently Asked Questions
### What is the difference between Custom Experiments and Ad Variations in Google Ads?
Custom Experiments test campaign-level changes on a single campaign, such as bidding strategies or targeting settings. Ad Variations test a single copy change, like a headline swap, across multiple campaigns or your entire account at once. Use Custom Experiments for deep, single-campaign tests. Use Ad Variations for broad, copy-focused tests.
### How long should I run a Google Ads A/B test?
Google recommends running tests for at least 4-6 weeks. Shorter windows rarely reach statistical significance, which means the results can be misleading. Give each test enough time to collect data across different days, audiences, and search patterns before drawing conclusions.
### Can I A/B test Performance Max campaigns?
Yes. Performance Max campaigns support asset group A/B testing as a beta feature. You split assets into a control group and a treatment group within the same asset group. Asset groups are locked during the test, meaning you cannot add, edit, or remove assets until the experiment ends. Per Google's Ads Help Center, the recommended runtime is 4-6 weeks.
### What happens if I change my original campaign while an experiment is running?
Changes you make to the original campaign during an experiment do not automatically sync to the experiment version. This is important. Unplanned changes can corrupt your test results by creating differences between the two versions that you did not intentionally set up. Make any changes before the experiment starts or after it ends.