Marketers who are able to strike an optimal balance between learning and performance gain an advantage over their competition.
When marketers look to optimize conversion rates on their websites, they face a practical tradeoff. On one hand, there is a need to learn by testing new ideas to see whether they perform better than existing ideas. On the other hand, there is a need to drive performance by applying the best of what's been learned to deliver results. Statisticians and engineers often refer to this tension as the explore-exploit tradeoff.
This tradeoff sometimes leaves marketers between a rock and a hard place because visitor traffic is finite. If you allocate more traffic to driving lift based on previous learnings, you are limiting the speed of learning on new ideas. If you allocate more traffic to learning about new ideas quickly, you miss some upside in performance. Marketers who are able to strike an optimal balance between these competing goals gain an advantage over their competition. These marketers work with and act on better information and use that learning to get better results.
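To make the tradeoff concrete, here is a minimal sketch of one classic explore-exploit strategy, epsilon-greedy, simulated in Python. The variation names, conversion rates, and the 10% exploration share are all hypothetical illustrations, not a recommendation; real optimization systems use more sophisticated allocation.

```python
import random

# Hypothetical true conversion rates for two page variations. The
# algorithm never sees these; it only observes individual conversions.
TRUE_RATES = {"control": 0.05, "challenger": 0.07}
EPSILON = 0.1  # fraction of traffic reserved for exploration (assumed)

def serve_visitor(stats):
    """Explore at random with probability EPSILON; otherwise exploit
    the variation with the best observed conversion rate so far."""
    if random.random() < EPSILON:
        arm = random.choice(list(stats))
    else:
        arm = max(stats, key=lambda a: stats[a]["conversions"] / max(stats[a]["visits"], 1))
    converted = random.random() < TRUE_RATES[arm]  # simulate the visit
    stats[arm]["visits"] += 1
    stats[arm]["conversions"] += int(converted)
    return arm

stats = {a: {"visits": 0, "conversions": 0} for a in TRUE_RATES}
random.seed(0)
for _ in range(10_000):
    serve_visitor(stats)

for arm, s in stats.items():
    print(arm, s["visits"], round(s["conversions"] / s["visits"], 4))
```

The tension in the text maps directly onto EPSILON: raise it and you learn about new ideas faster but send more visitors to weaker variations; lower it and you squeeze more performance out of the current leader but learn more slowly.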
A common approach marketers take to solving this problem is A/B testing. Marketers allocate a small amount of traffic toward learning, testing one idea at a time, and then take action by applying what they've learned from their tests to all of their visitors. Unfortunately, this approach has three key weaknesses:
- It’s slow. For the large majority of websites, A/B tests take a long time to yield results, in part because small portions of traffic are used for testing.
- The loss in upside while testing can be large. A/B tests aren't considered complete until they reach statistical significance, which can take weeks or longer. Throughout the testing period, the marketer is sending 50% of the test traffic to the underperforming idea.
- It’s static. After the tests are complete and the winning idea is getting all the traffic, the learning stops. The market might change, users might change, and the competitive space might change. To recognize and adapt to those changes, however, an entirely new testing cycle is needed.
So what should marketers do?
To keep learning about new ideas while still driving conversions from your site now, you need to do three things well:
- Automatically adjust exploration to find the right explore-exploit balance
- Handle changes to audience traffic and changes to variations at any time
- Personalize what is shown at the individual level
We think it is possible to do all three of these things well and in the right balance, and we believe one approach that does so is predictive personalization. These systems automatically optimize the balance between learning and performance while enabling you to test many ideas simultaneously. They continuously measure performance and reallocate traffic to the current best-performing ideas to deliver results. We believe predictive personalization systems are the way for marketers to make the right tradeoff.
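As one illustration of how such a system can reallocate traffic continuously, here is a minimal Thompson sampling sketch in Python. The three conversion rates and the visitor count are hypothetical, and this sketch covers only the traffic-reallocation piece; production predictive personalization systems also incorporate individual-level signals and handle variations being added or removed.

```python
import random

# Hypothetical true conversion rates for three competing ideas.
# The system only ever observes individual conversion outcomes.
TRUE_RATES = [0.04, 0.05, 0.065]

# Beta(1, 1) prior on each idea's conversion rate:
# alpha counts successes + 1, beta counts failures + 1.
alpha = [1] * len(TRUE_RATES)
beta_ = [1] * len(TRUE_RATES)

random.seed(1)
for _ in range(20_000):
    # Draw a plausible conversion rate from each posterior and serve
    # the idea with the highest draw. Uncertain ideas still win some
    # draws (exploration); proven winners win most (exploitation).
    draws = [random.betavariate(alpha[i], beta_[i]) for i in range(len(TRUE_RATES))]
    arm = draws.index(max(draws))
    if random.random() < TRUE_RATES[arm]:  # simulate the visit
        alpha[arm] += 1
    else:
        beta_[arm] += 1

# Visits served to each idea: traffic shifts toward better performers
# automatically as evidence accumulates, with no fixed test period.
traffic = [alpha[i] + beta_[i] - 2 for i in range(len(TRUE_RATES))]
print(traffic)
```

Because the posteriors are updated after every visit, the allocation also adapts if an idea's performance drifts over time, which addresses the "static" weakness of classic A/B testing described above.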