My first real introduction to Larry Kim was when I helped coordinate the webinar: The 10 Weirdest A/B Tests Guaranteed to Double Your Business Growth.
Larry came up with that genius headline and naturally we had a full house that day. I’ve been hooked on Larry ever since.
We wanted to catch up with Mr. Kim and ask him a few questions about conversion rate optimization, testing, and digital marketing.
Here’s what he had to say…
Yes. This is true, and it becomes increasingly the case the more A/B testing you do.
Here’s why: Conversion rates aren’t random. There is a theoretical limit to how high, say, a whitepaper download form can convert. It’s not like you can keep doubling the conversion rate forever.
This is essentially the law of diminishing returns. Every time you discover a new “winner,” it becomes harder and harder to find the next one, and the margin by which the new contender beats the previous champ gets smaller and smaller.
You can read about this and other CRO Truth Bombs here.
Some advanced users might be annoyed at seeing different homepages or unusual URL parameters across browsing sessions. It bugs me a bit, but probably only because I’m an internet marketer, so of course I notice that sort of thing. My wife is a doctor and probably wouldn’t even notice.
There are too many to list. But off the top of my head, the two that come to mind are:
My greatest marketing win was the creation of my AdWords Grader. It’s a free tool that has been used by a gazillion people and has graded gazillions of dollars of ad spend.
In hindsight, I would have made it three years earlier. That’s the problem with great ideas – they’re usually obvious in hindsight.
Say you really wanted to expand your market to a new audience – you could optimize for new visitors (as opposed to repeat visitors) – even though the conversion metrics for “new” people are usually much worse than for people who are familiar with your brand.
I also often advise people to optimize for user engagement metrics like Click-Through Rate, even though they are technically just one step removed from a business metric like Cost Per Conversion.
Probably Facebook Ads. And you wouldn’t even need 1 million.
These are not necessarily new, but are exciting and have new features being added every week:
A/B testing is likely to be pushed down a level in the marketing stack. If you’re just testing small changes to flows or presentation, an AI-enabled system should be able to do this monkey work automatically.
As an example, in Facebook Ads you can’t even A/B test your ad creatives. The platform auditions them for you and automatically picks a winner. In Google AdWords, the platform can even rewrite your ads for you if it thinks your ad copy is poor. The same will happen to landing pages (which are essentially just bigger ads).
That is not to say that other important marketing disciplines like user experience and design are any less relevant today than before. But the mechanical grunt work behind A/B testing is likely to become increasingly automated.
I do a lot of marketing. Most marketers, over time, get promoted to managerial roles that are less hands-on. I still spend a good part of my day on internet marketing activities like blogging, SEO, AdWords, Facebook Ads, content marketing, etc.
Earlier in my career, I drank the CRO Kool-Aid – the belief that small A/B tests, when added up, could make a big difference. Naturally, I wanted to attribute changes in results to my hard work in conducting these experiments. But now that I’ve been doing this for over a decade, I realize I may have been misattributing cause and effect. The biggest CRO hack by far is brand awareness: people, consciously or unconsciously, dramatically favor clicking on and buying from the brands they have heard of and love.