
A/B Testing SaaS Products
ProductCamp Provo
March 29th, 2014
Nate Carrier
A/B Testing: An Introduction
• 50/50 randomized split between two experiences (sketch below)
• Used in
▫ Web development
▫ Internet marketing
▫ SaaS Products
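A minimal sketch of how that 50/50 split is often implemented: hash the user ID so each user lands in the same bucket on every visit. The function name and bucket labels here are illustrative, not from the slides.

```python
import hashlib

def assign_bucket(user_id: str) -> str:
    """Deterministically assign a user to 'control' or 'test'.

    Hashing the user ID (instead of flipping a coin per request)
    keeps each user in the same experience across visits.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    # Even hash value -> control, odd -> test: an ~50/50 split.
    return "test" if int(digest, 16) % 2 else "control"

print(assign_bucket("user-42"))  # prints the same bucket on every run
```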
Why A/B Testing?
• Web development and internet marketing
▫ What metrics do you try to improve?
 Conversion
 Sales
 ROI (on ad spend)
 Engagement
• SaaS (cloud-based software)
▫ Improve user experience (UX)
▫ Increase engagement
▫ Drive long-term profitability
When Should You A/B Test?
• Before introducing a new feature
• For small day-to-day improvements
• When you want to improve the customer experience
• When you want to increase sales/subscriptions
• All the time!
How to A/B Test
• Define goal / question
▫ Why are you running the test?
• Identify metrics
▫ How do you identify a successful test?
• Design test experience
• Set up data collection
▫ Google Analytics (very limited), Adobe Marketing Cloud, custom, etc. (sketch below)
• Analyze data
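A minimal sketch of the "custom" data-collection option, assuming one logged row per event with the user, test group, and action; the file name and field layout are illustrative placeholders.

```python
import csv
import datetime

def log_event(user_id: str, test_group: str, action: str,
              path: str = "ab_events.csv") -> None:
    """Append one event row; the analysis step reads this file later."""
    timestamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([timestamp, user_id, test_group, action])

# Example: record that a test-group user voted on a post.
log_event("user-42", "test", "vote:post-7")
```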
Let’s Analyze Data
• Test and Control have different numbers of users, so compare per-user rates rather than raw totals!
• Google Analytics
▫ Email: [email protected]
▫ Pw: ProductCamp Provo (with the space)
▫ Shortcut: A/B Test on Voting
• Excel Data
▫ http://bit.ly/1mdQYJz
 Limited data pushed into website database
 User, Test Group, Post (at vote level)
Some Ideas of What to Look For
• Difference between Test & Control on:
▫ Votes per Visitor (sketch below)
▫ Visit duration
▫ Sessions voted for (somewhat time-consuming)
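A sketch of computing the first metric from the vote-level export described above (one row per vote with User, Test Group, Post); the file name is a placeholder for the bit.ly download, and pandas is assumed.

```python
import pandas as pd

# Placeholder file name; columns match the vote-level fields
# named on the prior slide: User, Test Group, Post.
votes = pd.read_excel("ab_test_votes.xlsx")

per_group = votes.groupby("Test Group").agg(
    total_votes=("Post", "count"),
    visitors=("User", "nunique"),
)
per_group["votes_per_visitor"] = per_group["total_votes"] / per_group["visitors"]
print(per_group)
```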
What Insights Have You Found?
• How do the test and control groups differ?
▫ Number of votes per visitor
 Test > Control
▫ Visit duration
 Test < Control
▫ Different sessions voted for?
Statistical Significance
• Provides confidence in result
▫ Insignificant: the difference is likely just random variation
▫ Significant: the difference most likely has a real cause
 Analytics and statistics only reveal correlation
 Is that a bad thing?
• Can require more data than we can get
• Requires more skill to calculate
▫ R
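The slide points to R; an equivalent check is easy in Python too. A sketch using Welch's two-sample t-test on per-visitor vote counts; the numbers below are made-up illustrative values, not the workshop data.

```python
from scipy import stats

# Hypothetical votes-per-visitor samples (illustrative only).
control = [2, 0, 1, 3, 1, 0, 2, 1, 1, 0]
test    = [3, 1, 2, 4, 2, 1, 3, 2, 2, 1]

# Welch's t-test: is the difference in means larger than
# random variation would plausibly produce?
t_stat, p_value = stats.ttest_ind(test, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# p < 0.05 is a common (but arbitrary) threshold for significance.
```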
Connect with Me
@nate_carrier
linkedin.com/in/natecarrier
[email protected]