
Conversion Rate Optimization Testing & Measurement [CRO Chat Streamcap]

Justin Rondeau (@Jtrondeau) led today’s chat with a topic and question set titled “CRO Testing & Measurement.” The following is the transcribed Streamcap from the live chat:

Q1: For people currently testing: Where do you focus your testing efforts?

  • Value proposition – building a connection with the reader and making an offer they’re interested in.
    Peep Laja (@peeplaja)         
  • Typically focus testing on higher volume segments (landing pages with more volume) then roll out to segments with lower traffic.
    James Svoboda (@Realicity)         
  • I look at deep conversion oriented pages. Gives short term conversion value, easier LTV and great for segmentation insights.
    Justin Rondeau (@Jtrondeau)         
  • One of our recent investments has been surveys of existing customers.
    Unbounce (@unbounce)         
  • Biggest area of opportunity.
    IonInteractive (@ioninteractive)         
    • How do you define “biggest area of opportunity”? An untapped segment or an underperforming one?
      James Svoboda         
      • Typically highest volume of traffic (especially for PPC)
        IonInteractive
  • For the near future we are doing tests on delivery method for B2B materials.
    Unbounce (@unbounce)         
  • I focus at the intersection of traffic and opportunity. High traffic pages with potential to do better. In essence, anywhere that we specifically have evidence (analytics, customer, session recording) +opportunity (traffic, value).
    Craig Sullivan (@OptimiseOrDie)         
  • Depends. Highest volume of traffic, lowest hanging fruit, or sometimes ‘big win’ areas are defined specifically by customers.
    Jessica Collier (@jsscacollier)         
    • There’s never a ‘one size fits all’ solution for selecting your tests. That’s why it is important to continuously test.
      WhichTestWon (@WhichTestWon)         

Q2: Which A/B or multivariate testing platforms have you used before, and what did you like or dislike about them?

  • Primarily in-house builds w/ GWO due to budget buy-in from HiPPOs. That’s changing in 2012 as more clients get familiar with CRO.
    James Svoboda         
    • We have seen major shifts of people from free platforms to paid platforms during our testing awards.
      WhichTestWon         
      • Awareness helps obtain budget.
        James Svoboda         
  • GWO, Optimizely – good for free or cheap. Maxymiser – not flexible enough, OK otherwise. Autonomy Optimost (use this now and love it for coping with *anything*).
    Craig Sullivan         
    • What sort of flexibility do you require for your testing efforts?
      WhichTestWon         
      • Being able to do complex design changes, redirection handling, variable injection, and heavy data analysis. Plus Autonomy does the heavy lifting on test setup, so we don’t have to. We effectively outsource the boring stuff – well, maybe not the boring stuff, but the things we don’t add value to. Also, having a multi-iterative wave setup for testing allows for fast whittling of poor variables. That kinda stuff.
        Craig Sullivan         
  • I’ve used Unbounce, Google Website Optimizer, Test & Target, and internal systems.
    Carlos del Rio (@inflatemouse)         
  • GWO, VWO, Unbounce, Sitespect, Monetate.
    Alex Cohen (@digitalalex)         
  • We primarily use Unbounce, but we have also used GWO for some comparison work.
    Unbounce         
  • We’ve also used GWO for website testing. LiveBall for landing page and page parts testing.
    IonInteractive         
  • Our tool is great for A/B testing, which is what people should be doing 9 out of 10 times. But, there comes a time for MVT.
    Unbounce         
    • If you’ve got the traffic for statistically significant results with MVT testing in a respectable time frame, I’m all for it.
      WhichTestWon         
      • Volume for Stat. Sig. is key and tough for niche segments.
        James Svoboda         
  • Oops – I did forget to add Unbounce, who have more brains on this than most people around AND a good tool. We do a lot of 8/16/32-way MVT – you get more insight if you remove variables quickly.
    Craig Sullivan         
  • It is tough to pick an overarching element to test. It really is decided by the conversion goal and the particular audience.
    Justin Rondeau         
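Several answers in this thread hinge on reaching statistical significance before calling an A/B test. As a rough illustration (not a tool any participant mentioned), here is a minimal two-proportion z-test in Python; the function name and the conversion counts in the example are made up for demonstration.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-tailed two-proportion z-test for an A/B test.

    conv_a/n_a: conversions and visitors for the control,
    conv_b/n_b: conversions and visitors for the variant.
    Returns the z statistic and the two-tailed p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 200/10,000 control vs. 260/10,000 variant
z, p = ab_test_z(200, 10_000, 260, 10_000)
print(round(z, 2), round(p, 4))
```

A p-value under 0.05 here would clear the usual 95% confidence bar – though, as the participants stress, significance alone doesn’t excuse ignoring business cycles or traffic-mix shifts.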

Q3: What page elements do you test most often? Why?

  • Images bc they = 1k words, Copy bc messaging is important, and Complete Format/Layout for major impact bc cost/resources reg.
    James Svoboda         
  • Copy, because it’s around 50-70% of the lift in any test (straplines, call-to-action wordings): 28M tests show us this. Images, because they convey powerful emotions and get attention, esp. people images. And funnels, because everyone has a rubbish funnel – it just needs to stop leaking money like a firehose.
    Craig Sullivan         
  • Headline, Image, Layout, CTA tend to be the heavy lifters for landing pages.
    Unbounce         
    • What other pages do you run tests on outside of Landing Pages?
      WhichTestWon         
      • We also test internal paths, but any portion of a funnel can be treated like an individual landing page.
        Unbounce         
  • Yes, title too. Layout agreed, but only if it drives *something*, not just a shuffle round. For example, we know that uniform and eye gaze make differences in images now, so take different photos, etc. Avoid stock photos for hero images – it hits your conversion. Use ‘real’ people from your company.
    Craig Sullivan         
  • For landing page A/B test experiences: multi-step, microsite, segmentation.
    IonInteractive         
  • I love looking at the paths ppl took to a final conversion, keeps things in perspective & inspires for future tests.
    WhichTestWon         
    • Me too – my favourite thing is watching, say, 100 videos (sped up 5-10x) in ClickTale of a new funnel or feature. Agreed on writing. We find it’s the video that plays in the customer’s head, not the words themselves, that shifts them.
      Craig Sullivan         
  • The language used in the benefits content is also of major importance. Many testers create a mismatch between visitor and tone.
    Unbounce         
  • If I had to limit to just two, it would be headline and CTA.
    Aaron Weiche (@AaronWeiche)         
  • Innovate for big lifts, then find out which elements drive conversions through iterative testing (headlines, CTAs, forms)
    IonInteractive          
  • Headlines -> benefit-driven are a must. Love imagery testing, especially directional “gaze,” gender & ethnicity 
    Jessica Collier         
  • Making sure the Title of page maintains scent from previous click.
    Joseph Weller (@josephpweller)         

Q4: What is the most difficult part of the testing process in your organization? How do you get over these difficulties?

  • Getting enough programming time to set up the pages to test. Still working on that one. We have a programming dept. Nice that it’s in-house, but they’re so busy with other things that testing gets put to the back.
    Michelle Morgan (@michellemsem)         
    • In-house is great, but I can see where testing would be put on the ‘back burner’. Tough to get everyone involved.
      Justin Rondeau         
  • Getting buy-in & budget from clients who are new to CRO. But once you do, it’s so powerful and eye opening.
    James Svoboda         
  • Designing tests so they’re likely to succeed, help us learn things, & make them outcome-driven (work backwards from the business problem).
    Craig Sullivan         
  • We seem to have only service providers here, not sure this applies. My main challenge is finding CRO-crazy analytical staff!
    Peep Laja         
    • It’s definitely tough to get everyone to sign on. ESP if they don’t understand CRO and think more traffic is the answer.
      WhichTestWon         
  • Testing is an overlap of hard facts and creativity. Hard to construct ‘winning’ tests if you can’t balance the two.
    WhichTestWon         
  • Misguided political HiPPO deference instead of being more like DOGs (Data Optimised Goalseekers).
    Paul Gailey (@paulgailey)         

Q5: What are some testing mistakes/pitfalls you have noticed or committed?

  • Testing too many variables, or ones that were too similar. Designing tests too big for traffic/confidence. Testing randomness. Not testing for at least two business cycles. Not monitoring changes in external traffic (adverts, PPC budgets, etc.). Not monitoring for seasonal changes, traffic composition changes, market changes, etc.
    Craig Sullivan         
  • Very often we see users changing traffic patterns in disproportionate ways.
    Unbounce         
  • Oftentimes there is pressure to “call” a test early, even though it hasn’t reached statistical significance.
    Jessica Collier         
  • Biggest misconception I have seen is that people think ‘All Traffic is the Same’. This leads to very skewed results.
    Justin Rondeau         
  • Seen too many landing pages developed by unqualified designers who are not familiar with CRO. The clue is often the “Submit” button. The mistake here is treating landing pages like any old page, without message match, select CTAs, and more.
    James Svoboda         
  • The biggest one – designing a test where an outcome means you do nothing.
    Craig Sullivan         
  • Testing random stuff, toying with elements that will not produce significant lift, testing without hypotheses.
    Peep Laja         
  • Conservatism
    Paul Gailey (@paulgailey)         
  • Sample sizes too small, confidence not explained, not comparing cross country results.
    Craig Sullivan         
  • Testing on low-traffic pages, keeping a test running for too long, testing unimportant things (e.g. bullets).
    Joseph Weller         
  • Being in meetings when you could be testing or working on it.
    Craig Sullivan         
  • Important to test & optimize creative by traffic source (PPC, Display, Email, etc.)
    IonInteractive         
  • People saying “what do you think?” instead of “what do users think?” Phonetically close, but mindsets millions of light years apart.
    Paul Gailey         
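Several of the pitfalls above (samples too small, tests “too big for traffic/confidence”) come down to doing a power calculation before launch. As a rough sketch only, the function below uses the standard two-proportion power approximation (alpha = 0.05 two-tailed, 80% power); the function name and the 2% baseline / 20% lift in the example are illustrative, not figures from the chat.

```python
import math

def visitors_per_variant(base_rate, rel_lift):
    """Rough visitors needed per variant to detect a relative lift
    in conversion rate at alpha = 0.05 (two-tailed) with 80% power."""
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    z_alpha = 1.96  # critical z for a two-tailed 5% test
    z_beta = 0.84   # z for 80% power
    # Sum of variances of the two sample proportions
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (p2 - p1) ** 2
    return math.ceil(n)

# Hypothetical example: 2% baseline, trying to detect a 20% relative lift
print(visitors_per_variant(0.02, 0.20))
```

With this approximation, a 2% baseline and a 20% relative lift call for roughly 21,000 visitors per arm – which is why, as noted above, niche low-traffic segments struggle to reach significance in a respectable time frame.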

Resources

More CRO Chats

Don’t forget to stay tuned for the next #CROchat on Thursday at 12 noon Eastern, 9 am Pacific, and 5 pm in the UK. Same Chat time, same Chat channel.

PPC Chat

Also, join us for #PPCchat on Tuesdays at 12 noon Eastern, 9 am Pacific, and 5 pm in the UK for a similar session revolving around everything PPC. Those streamcaps can be found on Matt Umbro’s PPCChat Blog.

CRO Chat Participants

Check out the CRO Chat Twitter list to see and connect with all current and prior participants.

About the Behind-the-Scenes Streamcap Guy

This is a post by Paul Kragthorpe. Connect with me @PaulKragthorpe.

Like this post? Rate it, Save it, Share it!


