Since I attended SXSW last week, I thought it'd be the right thing to do to share my notes from the panels. They're incomplete, I've probably interpreted some statements wrongly, and there are probably plenty of typos. But I felt I'd be a complete shmuck if I didn't do the community thang and share my notes. So if you're not interested, apologies for the next few posts, which will each summarise a panel. At the end, I'll try to add links to other (better) coverage of the same panels to give the bigger picture. If you've taken notes or have something to add (like videos!), just leave a comment and I'll include it in my post.
First off, the "A/B Testing: Design Friend or Foe?" panel notes...
About: 'A/B Testing' is the practice of directing web traffic to multiple alternative designs to determine which is optimal. This method raises significant questions regarding the role of a designer and the need for a traditional design approach when deciding which design is 'best.' Are we being cut out of the equation?

Panelists: Corey Chandler (Lead Interaction Des, eBay Inc), Jake Cressman (Producer, Electronic Arts), Chris Maliwat (Sr Dir of Prod Mgmt, Vuze Inc), Micah Alpern (Design Dir Social Se, Yahoo! Inc), Elliot Shmukler (Principal Prod Mgr, LinkedIn)

A/B testing: achieving optimal design by testing different options on the web, sending traffic to multiple destinations and tracking behaviour.
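The mechanics behind that definition are simple: each visitor gets assigned to one of the alternative designs, and their behaviour is tracked per bucket. A minimal sketch of the assignment step (the function name and variant labels are my own, not from the panel) might look like this — hashing the user id so the same person always sees the same design:

```python
import hashlib

VARIANTS = ["control", "variant_b"]  # the designs under test

def assign_bucket(user_id: str) -> str:
    """Deterministically assign a user to a variant by hashing their id,
    so a returning visitor always lands in the same bucket."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Over a large population the split comes out roughly even:
buckets = [assign_bucket(f"user-{i}") for i in range(10_000)]
print(buckets.count("control"), buckets.count("variant_b"))
```

Deterministic hashing (rather than a coin flip per page view) matters for the point made later in the panel: users should get a consistent experience within a test.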
Usability testing and a designer's own views can only answer some questions.
Don't make predictions before you A/B test; they colour your thinking. Intuition is unavoidable, but be careful with it.
Even expert designers and marketers don't know what will work best.
Don't crowdsource design. People want a lot of things, and if you built to everything the crowd asked for, the iPod would have ended up a gigantic, feature-bloated thing. (Shmukler)
A/B testing helps you optimise, but it won't let you innovate from scratch or change a flow completely. Revolutionary change takes more than these microtasks. (Maliwat)
Cressman: They tried 20 ideas for where to put ads, including some completely absurd ones. Some of the absurd placements ended up working, and nobody could explain why.
Balance your time between optimising small incremental changes and pursuing the large, revolutionary, volatile ones. Remember to do the big 10% change too. (Maliwat)
Don't forget the things you can't measure; not everything shows up as numbers you can interpret. There's a larger question: are people enjoying their experience? Would they recommend it to someone else? (Chandler)
Be very careful what you A/B test: you *are* giving users a changing experience, so choose your tests wisely. Agree on a success metric up front. How long before you freak out about the results? How long do you give it (the burn-in period) before expecting change?
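Part of "not freaking out" is checking whether an observed difference in the success metric is bigger than chance would explain. As a rough sketch (the panel didn't prescribe a method; a two-proportion z-test on conversion rates is one standard choice, and the numbers below are made up):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference in conversion rate between
    variant A (conv_a successes out of n_a visitors) and variant B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 10,000 visitors per arm; B converts at 5.5% vs A's 5.0%.
z = two_proportion_z(500, 10_000, 550, 10_000)
print(round(z, 2))  # 1.59 — below 1.96, so not yet significant at the 5% level
```

Even a half-point lift on 10,000 visitors per arm doesn't clear the usual significance bar here, which is exactly why you decide the burn-in period and the metric before looking at the numbers.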
Don't forget the "poking because it's new" factor when something changes. People will try and play with (but not necessarily go through with) the new features you put up.
How many visits does it take before a user even sees the new change, and how many times do they need to see it before they understand what it's about? Take this into account when making a change. (Alpern)