A/B Testing Over $1M Worth of Display Ads – 5 Things We Learned
Published: October 27, 2015
Author: Jason Puckett
The open-endedness of display ads can make testing and optimization challenging. The question “Where do I begin?” has run through the minds of account managers and designers alike for many years.
At AdBasis, we analyzed display ad metadata from more than 60 eCommerce and SaaS companies, comparing the design characteristics of each ad variation against its performance. The goal of this analysis was to distill five useful insights for creating or iterating on display campaigns. With that in mind, this article is designed to give you practical action items that can be implemented quickly and easily.
But first, the sample size: we looked at a lot of ads. As part of this evaluation, we extracted design characteristics from these ads, and below we discuss the elements that had the biggest impact on Click Through Rate (CTR).
The data for this study was compiled by identifying every experiment in our database that directly tested a specified criterion (an identified ad design element).
Once those experiments were identified, the percent change in CTR was calculated for each experiment individually. AdBasis deliberately analyzed each experiment in isolation, weighting no experiment more heavily than another. This controlled for differences in ad group targeting, volume, market, timing, and so on.
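The aggregation described above can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic (per-experiment percent change in CTR, then an unweighted mean); the record layout and field names are hypothetical, not AdBasis's actual schema.

```python
# Sketch of the unweighted aggregation: each experiment's CTR change is
# computed in isolation, then averaged with equal weight per experiment.
# Data shapes below are illustrative only.

def ctr(clicks, impressions):
    """Click-through rate as a fraction."""
    return clicks / impressions

def percent_change(control, variant):
    """Percent change in CTR from the control variation to the test variation."""
    base = ctr(control["clicks"], control["impressions"])
    test = ctr(variant["clicks"], variant["impressions"])
    return (test - base) / base * 100

def average_lift(experiments):
    """Unweighted mean of per-experiment CTR changes: every experiment
    counts equally, regardless of its traffic volume."""
    changes = [percent_change(e["control"], e["variant"]) for e in experiments]
    return sum(changes) / len(changes)

experiments = [
    {"control": {"clicks": 40, "impressions": 10000},
     "variant": {"clicks": 50, "impressions": 10000}},   # +25% lift
    {"control": {"clicks": 10, "impressions": 5000},
     "variant": {"clicks": 11, "impressions": 5000}},    # +10% lift
]
print(round(average_lift(experiments), 1))  # 17.5
```

Because the mean is unweighted, a small experiment moves the result exactly as much as a huge one, which is what isolates the design element from differences in targeting and volume.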
1. Humans Work Best
When selecting background or main images for your display ads, images that contain people are the most reliable bet. Across the A/B tests we examined for all 60 brands, ads containing images of people had a 12 percent higher CTR than those that did not. The takeaway: people like people!
2. Use Photos Rather Than Designs
If using images of people isn’t a valid option for your brand, using photos of other nouns (places or things) is a great option. Ads that contain a simple color or rectangle as the background do not perform as well as ads that use photos.
We studied all of our A/B tests that compared photo backgrounds against "designed" background images. In these cases, the average CTR of ad variations with photo backgrounds was 9.9 percent higher.
3. Use Action Words in Your Calls to Action
There is some debate among marketers about which calls to action work best and which will ultimately get the attention of potential customers. Action words like "Get" (7 percent) and "Now" (8 percent) both yielded significant improvements in CTR. Instilling a sense of urgency in the ad increases CTR; the data bears this out.
4. Include Images of Your Products
Whether you are selling a piece of software or a physical consumer product, showing it off will help. We queried our database looking for experiments that used product screenshots or product images.
Within experiments that tested product screenshots or product images vs. other image types, the ad variations containing product images improved CTR by 18.2 percent.
5. Blue is the Best Button Color
We also queried our database for ads that contained rectangles that made up less than 33 percent of the canvas. We would consider these “buttons.” We ran a query on the typical color of buttons created in our tool and posed the question, “Which button color works best?”
The answer is blue. When blue buttons were compared against other colors, blue yielded a 6.19 percent average increase in CTR. No other color showed a consistent lift.
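The "button" classification above boils down to a simple area threshold. Here is a hedged sketch of that heuristic; the function name and parameters are illustrative, not the actual AdBasis data model.

```python
# Sketch of the "button" heuristic: treat a rectangle layer as a button
# when it covers less than 33 percent of the ad canvas.

BUTTON_MAX_CANVAS_FRACTION = 0.33

def is_button(rect_w, rect_h, canvas_w, canvas_h):
    """True when the rectangle's area is under 33% of the canvas area."""
    return (rect_w * rect_h) / (canvas_w * canvas_h) < BUTTON_MAX_CANVAS_FRACTION

# A 120x40 call-to-action on a 300x250 banner covers 6.4% -> a button.
print(is_button(120, 40, 300, 250))   # True
# A 300x200 block covers 80% of the same canvas -> treated as background.
print(is_button(300, 200, 300, 250))  # False
```

Any rectangle passing this test was then grouped by fill color to compare button colors against each other.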
This was a very interesting study for us to perform; hopefully it is equally valuable to our readers. Hunches are a fine starting point for ad design, but be sure to test them and make decisions based on data.
These insights will give you a nice “jumping off” point, but be sure to test these findings for your brand individually!