
Conversion Rate Optimization Testing & Measurement [CRO Chat Streamcap]
Justin Rondeau (@Jtrondeau) led today’s chat with a topic and question set titled “CRO Testing & Measurement.” The following is the transcribed Streamcap from the live chat:
Q1: For people currently testing: Where do you focus your testing efforts?
- Value proposition – building a connection with the reader and making an offer they’re interested in.
- Typically focus testing on higher volume segments (landing pages with more volume) then roll out to segments with lower traffic.
- I look at deep conversion oriented pages. Gives short term conversion value, easier LTV and great for segmentation insights.
- One of our recent investments has been surveys of existing customers.
- Biggest area of opportunity.
- How do you define the biggest area of opportunity? An untapped segment or an underperforming one?
- Typically highest volume of traffic (especially for PPC)
- For the near future we are doing tests on delivery method for B2B materials.
- I focus at the intersection of traffic and opportunity. High traffic pages with potential to do better. In essence, anywhere that we specifically have evidence (analytics, customer, session recording) +opportunity (traffic, value).
- Depends. Highest volume of traffic, lowest hanging fruit, or sometimes ‘big win’ areas are defined specifically by customers.
- There’s never a ‘one size fits all’ solution for selecting your tests. That’s why it is important to continuously test.
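The “intersection of traffic and opportunity” idea above can be sketched as a simple prioritization score. This is a hypothetical scheme, not anything from the chat: the field names, weights, and numbers below are purely illustrative.

```python
def prioritize(pages):
    """Rank candidate test pages by evidence x traffic x headroom.

    Hypothetical scoring: 'evidence' and 'headroom' are 0-1 judgment
    calls; 'traffic' is monthly visits. Higher product = test sooner."""
    return sorted(pages,
                  key=lambda p: p["evidence"] * p["traffic"] * p["headroom"],
                  reverse=True)

# Illustrative candidates (made-up numbers)
candidates = [
    {"name": "pricing",  "evidence": 0.8, "traffic": 40000, "headroom": 0.5},
    {"name": "homepage", "evidence": 0.3, "traffic": 90000, "headroom": 0.2},
    {"name": "checkout", "evidence": 0.9, "traffic": 15000, "headroom": 0.7},
]
ranked = prioritize(candidates)  # pricing first: strong evidence and volume
```

The point of a score like this is only to force the “evidence + opportunity” conversation; the exact weights matter far less than having them written down.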
Q2: Which (a|b) or multivariate testing platforms have you used before and what did you like or dislike about them?
- Primarily in-house builds w/ GWO due to budget buy-in from HiPPOs. That’s changing in 2012 as more clients get familiar with CRO.
- We have seen major shifts of the people using free platforms to paid platforms during our testing awards.
- Awareness helps obtain budget.
- GWO, Optimizely – good for free or cheap. Maxymiser – not flexible enough, OK otherwise. Autonomy Optimost (use this now and love it for coping with *anything*).
- What sort of flexibility do you require for your testing efforts?
- Being able to do complex design changes, redirection handling, variable injection, and heavy data analysis. Plus Autonomy does the heavy lifting on test setup, so we don’t have to. We effectively outsource the boring stuff – maybe not boring, but the things we don’t add value to. Also, having a multi-iterative wave setup for testing allows for fast whittling of poor variables. That kind of stuff.
- I’ve used Unbounce, Google Website Optimizer, Test & Target, and internal systems.
- GWO, VWO, Unbounce, Sitespect, Monetate.
- We primarily use Unbounce, but we have also used GWO for some comparison work.
- We’ve also used GWO for website testing. LiveBall for landing page and page parts testing.
- Our tool is great for A/B testing, which is what people should be doing 9 out of 10 times. But, there comes a time for MVT.
- If you’ve got the traffic for stat significant results with MVT testing in a respectable time frame, I’m all for it.
- Volume for Stat. Sig. is key and tough for niche segments.
- Oops – I did forget to add Unbounce, who have more brains on this than most people around AND a good tool. We do a lot of 8/16/32-way MVT – you get more insight if you remove variables quickly.
- It is tough to pick an overarching element to test. It really is decided by the conversion goal and the particular audience.
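Several answers above hinge on reaching statistically significant results before trusting a test. As a rough sketch of what that check involves, here is a pooled two-proportion z-test; the conversion counts below are hypothetical, and real tools wrap this (and more) for you:

```python
from math import sqrt
from statistics import NormalDist


def ab_p_value(conv_a, n_a, conv_b, n_b):
    """Two-tailed p-value for the difference between two conversion
    rates, using the pooled two-proportion z-test approximation."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))


# Hypothetical result: control 200/5000 (4.0%), variant 250/5000 (5.0%)
p = ab_p_value(200, 5000, 250, 5000)
significant = p < 0.05
```

This is exactly why volume matters for niche segments: the same 4% vs. 5% split on a few hundred visitors per arm would not clear the 0.05 bar.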
Q3: What page elements do you test most often? Why?
- Images bc they = 1k words, Copy bc messaging is important, and Complete Format/Layout for major impact bc cost/resources reg.
- Copy, because it’s around 50-70% of the lift in any test (straplines, call-to-action wordings): 28M tests show us this. Images, because they convey powerful emotions and get attention, esp. people images. And funnels, because everyone has a rubbish funnel – it just needs to stop leaking money like a firehose.
- Headline, Image, Layout, CTA tend to be the heavy lifters for landing pages.
- What other pages do you run tests on outside of Landing Pages?
- We also test internal paths, but any portion of a funnel can be treated like an individual landing page.
- Yes, title too. Layout agreed, but only if it drives *something*, not just a shuffle around. For example, we know that uniform and eye gaze make differences in images now, so take different photos, etc. Avoid stock photos for hero images – it hits your conversion. Use ‘real’ people from your company.
- For landing page A/B test experiences: multi-step, microsite, segmentation.
- I love looking at the paths ppl took to a final conversion, keeps things in perspective & inspires for future tests.
- Me too – my favourite thing is watching, say, 100 videos (sped up 5-10x) in ClickTale of a new funnel or feature. Agreed on writing. We find it’s the video that plays in the customer’s head, not the words themselves, that shifts them.
- The language used in the benefits content is also of major importance. Many testers create a mismatch between visitor and tone.
- If I had to limit to just two, it would be headline and CTA.
- Innovate for big lifts, then find out which elements drive conversions through iterative testing (headlines, CTAs, forms)
- Headlines -> benefit-driven are a must. Love imagery testing, especially directional “gaze,” gender & ethnicity
- Making sure the Title of page maintains scent from previous click.
Q4: What is the most difficult part of the testing process in your organization? How do you get over these difficulties?
- Getting enough programming time to set up the pages to test. Still working on that one. We have a programming dept. It’s nice that it’s in-house, but they’re so busy with other things that testing gets put to the back.
- In-house is great, but I can see where testing would be put on the ‘back burner.’ Tough to get everyone involved.
- Getting buy-in & budget from clients who are new to CRO. But once you do, it’s so powerful and eye opening.
- Designing tests so they’re likely to succeed, help us learn things, and are outcome-driven (work backwards from the business problem).
- We seem to have only service providers here, so not sure this applies. My main challenge is finding CRO-crazy analytical staff!
- It’s definitely tough to get everyone to sign on, esp. if they don’t understand CRO and think more traffic is the answer.
- Testing is an overlap of hard facts and creativity. It’s hard to construct ‘winning’ tests if you can’t balance the two.
- Misguided political HiPPO deference instead of being more like DOGs. (Data Optimised Goalseekers)
Q5: What are some testing mistakes/pitfalls you have noticed or committed?
- Testing too many variables or ones that were too similar. Designing tests too big for traffic/confidence. Testing randomness. Not testing for at least two business cycles. Not monitoring changes in external traffic (adverts, PPC budgets, etc.). Not monitoring for seasonal changes, traffic composition changes, market changes, etc.
- Very often we see users changing traffic patterns in disproportionate ways.
- Oftentimes there is pressure to “call” a test early, even though it hasn’t reached statistical significance.
- Biggest misconception I have seen is that people think ‘All Traffic is the Same’. This leads to very skewed results.
- Seen too many Landing Pages developed by unqualified designers who are not familiar with CRO. Clue is often the “Submit” button. The mistake here is treating landing pages like any old page without message match, select CTAs and more.
- The biggest one – designing a test where an outcome means you do nothing.
- Testing random stuff, toying with elements that will not produce significant lift, testing without hypotheses.
- Conservatism
- Sample sizes too small, confidence not explained, not comparing cross country results.
- Testing on low traffic pages, keeping a test running for too long, testing unimportant things (i.e. bullets)
- Being in meetings when you could be testing or working on it.
- Important to test & optimize creative by traffic source (PPC, Display, Email, etc.)
- People saying “What do you think?” instead of “What do users think?” Phonetically close, but mindsets millions of light years apart.
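Two of the pitfalls above – sample sizes that are too small and tests “called” early – come down to planning sample size before launch. Here is a minimal sketch of the standard two-proportion sample-size approximation; the 3% base rate and 20% relative lift are hypothetical inputs, not figures from the chat:

```python
from math import ceil
from statistics import NormalDist


def visitors_per_variant(base_rate, rel_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a relative
    lift over the base conversion rate, using the two-sided normal
    approximation at the given significance level and power."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value
    z_beta = NormalDist().inv_cdf(power)           # power term
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(variance * ((z_alpha + z_beta) / (p2 - p1)) ** 2)


# Hypothetical: 3% base conversion rate, aiming to detect a 20% relative lift
n = visitors_per_variant(0.03, 0.20)  # roughly 14k visitors per variant
```

Running the number up front makes the “volume for stat. sig. is key” point concrete: a niche segment that can’t deliver that traffic in a reasonable window simply can’t support that test design.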
Resources
- CROChat Member List – on Twitter
- Digital Marketing and Measurement Model by @avinash
- The Definitive How-To Guide For Conversion Rate Optimization – SEOmoz
- Google Website Optimizer
More CRO Chats
Don’t forget to stay tuned for the next #CROchat on Thursday at 12 noon Eastern, 9 am Pacific and 5pm in the UK. Same Chat time, same Chat channel.
PPC Chat
Also, join us for #PPCchat on Tuesdays at 12 noon Eastern, 9 am Pacific, and 5 pm in the UK for similar sessions revolving around everything PPC. Those streamcaps can be found on Matt Umbro’s PPCChat Blog.
CRO Chat Participants
Check out the CRO Chat Twitter list to see and connect with all current and prior participants.
- James Svoboda (@Realicity)
- Justin Rondeau (@Jtrondeau)
- IonInteractive (@ioninteractive)
- WhichTestWon (@WhichTestWon)
- Paul Kragthorpe (@PaulKragthorpe)
- Aaron Weiche (@AaronWeiche)
- Alex Cohen (@digitalalex)
- Carlos del Rio (@inflatemouse)
- Craig Sullivan (@OptimiseOrDie)
- Jessica Collier (@jsscacollier)
- Joseph Weller (@josephpweller)
- Michelle Morgan (@michellemsem)
- Paul Gailey (@paulgailey)
- Peep Laja (@peeplaja)
- Unbounce (@unbounce)
About the Behind-the-Scenes Streamcap Guy
This is a post by Paul Kragthorpe. Connect with me @PaulKragthorpe.