Save Hours of Work: Customized Script Creates Mass Campaign Experiments
Produced by Hanapin Marketing
For some brands, periodically testing bid strategies is important. The testing process, however, is very time-consuming. And when there are multiple brands and campaigns to test, the time it takes to run tests, declare winners, and implement the winning strategies can overwhelm any team. For one client, more than 250 campaigns across dozens of accounts in an MCC needed new tests.
After the issue was brought to Hanapin’s Analyst Team (a team that works specifically on creating custom scripts and dashboards for clients, so that they can manage their data, their way), they created a script that builds bid strategy experiments for 250+ campaigns across dozens of accounts within an MCC, all at the same time. While the script did the time-consuming work of setting up and implementing the experiments, the Hanapin team evaluated and analyzed each test. Each campaign was judged individually, so a winning strategy was implemented only where it actually performed best for that campaign. This not only saved the team hours, it created a more personalized testing process and ultimately improved cost per lead.
How the Script Worked
The script created and executed the experiments from the MCC level, allowing the team to build tests for dozens of accounts at once. Prior to this script, the approach was to test just a few campaigns within an account and apply the winning strategy to the entire account. While that might have worked for some accounts, it would not have worked for all and could have negatively affected results.
Here’s how the process worked:
- The Hanapin team determined the “control” and the “experiment” bid strategies. (In this example, the control was an enhanced cost per click strategy, while the experiment strategy was Maximize Clicks.)
- The Hanapin team ran the script at the MCC level to create bid strategy tests in 250+ campaigns across dozens of accounts.
- All eligible auctions were split 50/50 between the experiment campaigns and the control campaigns.
- The test ran for 30 days.
- The Hanapin team went through every experiment and declared the “winner” for each.
- Winning strategies were then implemented across the 250+ campaigns.
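The per-campaign evaluation in the last two steps can be sketched in Python. This is a simplified illustration, not Hanapin’s actual script (which ran inside Google Ads at the MCC level): the data structures, campaign names, and numbers are all made up, and it assumes cost per lead is the deciding metric, as in the article.

```python
# Hypothetical sketch of the "declare a winner per campaign" step.
# All names and figures below are illustrative, not real client data.

def cost_per_lead(arm):
    """Cost per lead for one arm; zero leads is treated as infinitely expensive."""
    return arm["cost"] / arm["leads"] if arm["leads"] else float("inf")

def declare_winner(control, experiment):
    """Pick whichever arm achieved the lower cost per lead."""
    return "experiment" if cost_per_lead(experiment) < cost_per_lead(control) else "control"

# Simulated 30-day results for two campaigns after the 50/50 auction split.
campaigns = {
    "campaign_a": {"control":    {"cost": 500.0, "leads": 20},   # CPL 25.00
                   "experiment": {"cost": 480.0, "leads": 24}},  # CPL 20.00
    "campaign_b": {"control":    {"cost": 300.0, "leads": 15},   # CPL 20.00
                   "experiment": {"cost": 310.0, "leads": 10}},  # CPL 31.00
}

# Each campaign keeps whichever strategy won *for that campaign*,
# rather than rolling one winner out account-wide.
winners = {name: declare_winner(arms["control"], arms["experiment"])
           for name, arms in campaigns.items()}
print(winners)
```

Here campaign_a would adopt the experiment strategy while campaign_b would keep its control, which is the per-campaign personalization the article describes.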