This week, we’re celebrating the release of Text Ad Zoom with a five-part Q&A series featuring the authors who wrote the articles highlighted in The Ultimate List of PPC Ad Testing Resources. Each day this week, we’ve featured a different question and the authors’ answers.
Previously, we asked:
- What are the biggest text ad testing mistakes?
- How do you pick which ads to test first?
- What factors have the greatest influence on testing?
- How important is text ad testing in overall campaign optimization?
Read below for the final question in our five-part series and the answers (in no particular order).
Text Ad Optimization Q&A #5: Have You Had Any Surprising Text Ad Testing Results?
Brad Geddes: I can’t count the number of times I’ve been surprised by results. I’ve seen ads that I thought were terrible and should easily be beaten in a test, and the ‘terrible’ ads worked surprisingly well. I did a test with one company where we changed just a single letter in the ad copy. We made a singular word plural, as we were wondering if that would help increase overall average sale amounts. That test failed miserably, and the plural version had a much worse conversion rate than the singular word. It taught us a lot about the shoppers, so it was a good test to have run, as it helped us design some different landing pages.
My overall thought is that whenever I say, “I think this will work,” it means I don’t really know and that we should test it instead. Leave the ego outside of the account. Run some ad copy tests and let the metrics tell you what’s best for your account’s profits.
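Letting the metrics decide usually comes down to a significance check on the test results. As a rough sketch (the click and conversion counts below are hypothetical, not from any test mentioned in this series), a two-proportion z-test comparing two ads’ conversion rates might look like this:

```python
from math import sqrt

def two_proportion_z(conv_a, clicks_a, conv_b, clicks_b):
    """Two-proportion z-test: is ad B's conversion rate
    significantly different from ad A's?"""
    p_a = conv_a / clicks_a
    p_b = conv_b / clicks_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (clicks_a + clicks_b)
    # Standard error of the difference between the two proportions
    se = sqrt(p * (1 - p) * (1 / clicks_a + 1 / clicks_b))
    return (p_b - p_a) / se

# Hypothetical numbers: ad A converts 80 of 2,000 clicks, ad B 110 of 2,000
z = two_proportion_z(80, 2000, 110, 2000)
# |z| > 1.96 corresponds to p < 0.05 (two-tailed)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

Until a difference clears a threshold like this, an apparent “winner” may just be noise, which is one reason surprising reversals like the singular/plural test above are worth running to completion.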
Andrew Goodman: Absolutely. We discover many things. Being “in business for 50 years” can come across as a negative — but being “online since 1997” is a positive. I recently tried an ad that explained how users need to scroll to see a category of product, because the client’s site has a poor experience! That doubled conversion rates! You might learn that saying a food item is “delicious” doesn’t help, but calling it “crunchy” does. You do have to keep testing, because it’s really hard to predict what works.
I was gobsmacked when I heard of Jeremy Schoemaker’s claim that the winning ad could just be the one that had a certain *shape* — an “arrow shaped ad”!!
I’ve incorporated this gently into some ad tests, and I am pretty sure I’ve seen it working from time to time, for no discernible reason other than just that: the shape.
I also strip ads to the bone, trying the game of “shortest ad wins.” Sometimes, it does. I believe this speaks to the cognitive process of users, and perhaps the minimalism of it flatters searchers who have had enough of the busyness of web pages and the excessive claims and information overload purveyed by the overstuffed world of marketing.
Seth Godin has a notion of helping natural selection along in organizations, by “increasing the mDNA diversity” (meme DNA) to allow for serendipity. You’ll never make cool discoveries without accidents, multiple sets of eyes, and even “lazy” ads that people just toss up on the board without overthinking. (Remember how Google’s founders came up with the “ingenious” Google UI because they “weren’t designers and don’t do HTML”?)
Having multiple sets of eyes and people with diverse perspectives and expertise trying ad experiments can be a plus for sure.
Jessica Niver: Most of my surprising results have revolved around how much different offers (50% off vs. buy one get one free vs. free shipping) impact CTR and conversion rate. I guess it’s logical, but watching things fluctuate so drastically as a result of those changes really demonstrated to me how important it is to test those things and implement what customers want. Also, testing the timing of launch for seasonal or holiday-based ads has been a lesson in how dramatically their performance can change, and in the importance both of using those types of ads to your advantage and of getting them out of your account before they lose value.
Chad Summerhill: I got a 12% overall lift in my brand campaign by adding the ® symbol campaign-wide.
Amy Hoffman: Each month at Hanapin we have an internal training day. Sometimes we’ll play a game called ‘Which Ad Won?’, in which we show two ads side by side for the same ad group and everyone has to guess which performed better. There are always surprising cases. It really just depends on your audience. Sometimes a rhetorical question wins, sometimes a strong call to action, sometimes a mixture of the two. It really proves the importance of both knowing your audience and testing different techniques.
Erin Sellnow: Nothing that really shocked me, but over time I have found DKI is very hit-or-miss. For some clients it does wonders, but for other clients it’s like I can’t even pay people enough to click on a DKI ad. I can never seem to predict correctly whether it’s going to work well or not.
Pete Hall: Definitely. I’ve written ads that I personally thought were sub-par (no clear CTA, not really relevant, DKI-heavy, etc.) and watched them outperform my “perfect” ads by leaps and bounds. You have to remember that just because you think your ad is perfect, chances are most everyone else won’t, and they don’t care. It’s all about standing out in the competition for the user’s click.
If you work on an account for long enough, you start to test everything you can with text ads and these can make a big difference. A good example of this is testing display URL variations, such as adding www. or not, or adding things after the domain name, i.e. /Free, and seeing big differences in results.
Ryan Healy: Absolutely. Happens all the time. Although the more I write ads and analyze why one ad won and one ad lost, it becomes easier. You start to see patterns at work, principles at play.
But the surprises never stop. That’s one reason testing is so important. It provides you with empirical evidence of what’s working… right this minute… in your market.
That’s very valuable information to have.
Jeff Sexton: As indicated in the previous answer, it’s actually fairly routine to be surprised by test results. And I think that anyone involved with any sort of web copy or website optimization testing will tell you that being surprised by a set of results is not only common, but a pretty routine occurrence. Nobody bats a thousand when it comes to optimization of any kind, and I think that’s especially true for text ad optimization.
Tom Demers: A lot of times the things that surprise me the most are the tests that don’t win. A type of test I see frequently is this:
- A generic, keyword focused ad that speaks to a search query but is pretty vanilla is set up
- The copywriter comes up with a really clever, attention-grabbing approach that doesn’t include the keyword but seems to be a much more thoughtful approach to the creative
- The clever new approach gets clobbered by the simple, boring use of the keyword in the title and a straight-forward value proposition and call to action
We also cover a lot of ad tests on our blog (one per week), and a common theme there that surprises people is how often small changes lead to big impacts.
Crosby Grant: You bet! There was the ad where we intentionally misspelled things. I wish I had an example handy. It was just jarring enough that we got a little lift in CTR. At the time I think we were advertising for a certain post-secondary-education-for-profit (an online school), who shall remain nameless. I think it could be said that the ad was targeted to the audience.
Rob Boyd: I’ve had some interesting results but one example comes immediately to mind because it’s on one of my favorite client accounts. I hope I don’t offend them too much in the event they come across this post but I get to make fun of myself a bit too.
I recently took on a client that had grown their PPC account’s spend to several hundred thousand dollars a month, and had done so with absolutely no conversion or revenue tracking. They had been making decisions based on feel for well over a year, with what adds up to millions of dollars. You can imagine the mayhem that presented itself when we turned the lights on and got tracking up and running… not pretty. There weren’t many things they were doing right, but the account was close enough that we turned it around very quickly. The one area where I can give them high praise is their ad writing. I’ve been managing the account for over four months now, and I have yet to create an ad that outperforms the ad structure they came up with long before we took on the account. It’s hard to imagine they got the recipe right without true goal metrics, but until I test an ad that beats it, I give them all the credit in the world! I think the lesson learned is that sometimes the client really does know their customers best.
Greg Meyers: The use of DKI in ads has traditionally been a fast and easy way to try to get the best CTR. But that’s not really the case when it comes to conversions. Also, with the birth of ad extensions and product feeds, it’s been a little more difficult to pinpoint success stories. With that said, future testing will require many more levels of intricacy.
Bonnie Schwartz: Recently, I ran an ad copy test: a new ad vs. an ad the client had been running in the account before we took over. The client’s ad was definitely decent, but it did not 100% follow best practices. The main thing was that this ad did not have a clear call to action. My test ad did. However, my ad was the clear loser, pretty much across all ad groups. This taught me a valuable lesson: best practices are great to keep in mind, but oftentimes they do not hold true. I think with PPC ad copy, small things you may not even think of, like the shape of the ad, the bolded keywords, or punctuation, may impact your CTR and conversion rate beyond the actual messaging. The most important thing is to test and let the numbers speak for themselves, because I have been proven wrong many times by the gap between what I thought would work and what actually did.
John Lee: When I first started doing PPC, I was always surprised at how a simple change in punctuation or capitalization could affect the CTR and even the CVR of an ad. This doesn’t surprise me anymore, but it sure did then. More recently, Google changed how text ads are displayed in the top 3 spots. Essentially, when proper punctuation is included in line 1 of an ad, Google will place line 1 next to the headline making it look like an organic SERP listing. The initial results (positive) from this change surprised me. I wasn’t expecting users to be so easily fooled by the new PPC ads’ visual resemblance to organic listings!
Jon Rognerud: Maybe not surprising, but using big numbers ($) to exclude people from clicking the ad showed us that we can “lower traffic” and “increase quality leads.” That is surprising to many in terms of strategy.
Learn More About The Authors
- Brad Geddes – Certified Knowledge
- Andrew Goodman – PageZero
- Jessica Niver – Hanapin Marketing
- Chad Summerhill – PPC Prospector
- Amy Hoffman – Hanapin Marketing
- Erin Sellnow – Hanapin Marketing
- Pete Hall – Room 214, a social media agency
- Ryan Healy – BoostCTR / RyanHealy.com
- Jeff Sexton – BoostCTR / JeffSextonWrites.com
- Tom Demers – BoostCTR / MeasuredSEM
- Bradd Libby – The Search Agents
- Crosby Grant – Stone Temple Consulting
- Rob Boyd – Hanapin Marketing
- Greg Meyers – SEMGeek / iGesso
- Bonnie Schwartz – SEER Interactive
- John Lee – Clix Marketing
- Jon Rognerud – JonRognerud.com
- Joe Kerschbaum – Clix Marketing