dConstruct 2008: part three
(See part one for some fascinating travel and eating anecdotes, and part two for the first half of Joshua Porter’s workshop)
Designing for sign-up
Contrary to what I (and presumably others) thought, this isn’t about the sign-up form! It’s more to do with the need to articulate the core value of what’s being offered to the user. In pseudo-physics terms, it’s about converting potential energy into kinetic energy.
Research suggests that sign-up is nine times harder than we think it is:
- users overvalue their current solution by a factor of three, and
- providers overvalue their offering by a factor of three
Multiply the two together and you get the nine-fold gap.
Getting from interested to sign-up - there are three types of visitors:
- I know I want to sign up
- I need to know more
- I’m sceptical
To meet those three visitor types where they are, there are three strategies for sign-up:
- immediate engagement - visitors can use the site and see what’s in it for them (WIIFM) without signing up
- articulate benefits and features
- use levels of description, e.g.
  - Netflix.com sign-up screen
  - Tripit.com
The carrot vs. the stick: Netvibes lets you do stuff first, without signing up. If you want to save your stuff, though, you have to register. This is a stick, rather than carrot, approach but it can work well.
Luke Wroblewski calls this “progressive engagement”, though Joshua prefers the term “instant engagement”. Some examples are Slide.com and Freshbooks.
Reputation
bq. “Social problems don’t have technical solutions”
The Yahoo Developer Network has some design patterns for reputation in its pattern library.
Reputation rewards need to be tied to quality as well as quantity; you need to reward (and highlight) desired behaviour. The example given was that of Harriet Klausner, a reviewer on Amazon whose review count works out at more than 5.5 reviews per day, every day, since 2001. There is some speculation about her authenticity (i.e. whether she’s actually a team of reviewers), but her reviews seem to contain nothing that couldn’t be gleaned from the back cover of the books in question.
What’s interesting is that, while no-one will get anywhere near Ms. Klausner in raw number of reviews, other reviewers come out ahead on more intelligent metrics: some have a much better ratio of helpful reviews to total reviews; others have more reviews marked as helpful overall.
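As a rough illustration of that quality-over-quantity idea, here’s a minimal Python sketch (the reviewer names and numbers are invented) that ranks reviewers by the share of their reviews marked helpful rather than by raw review count:

bc. # Hypothetical reviewer data: (name, total reviews, reviews marked helpful)
reviewers = [
    ("prolific", 12000, 3000),
    ("careful", 150, 140),
    ("occasional", 40, 35),
]
def helpful_ratio(total, helpful):
    """Share of a reviewer's reviews that readers marked as helpful."""
    return helpful / total if total else 0.0
# Ranking by raw volume rewards quantity; ranking by ratio rewards quality.
by_volume = sorted(reviewers, key=lambda r: r[1], reverse=True)
by_quality = sorted(reviewers, key=lambda r: helpful_ratio(r[1], r[2]), reverse=True)
print([name for name, *_ in by_volume])   # "prolific" comes first
print([name for name, *_ in by_quality])  # "careful" comes first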
Reputation isn’t just about people’s behaviour and actions; personal profiles contribute as well. “The profile must fit the domain”, however, so don’t ask users for the name of their dog on a business-focused site, for instance. Yelp.com is a good example of a site that combines lots of different reputation patterns.
bq. “Optimise for value-added behaviour”
Reciprocity
On LinkedIn, when someone recommends a colleague they’ve worked with, it’s very rare that the recommended person doesn’t return the favour with a recommendation of their own - so much so that a failure to return the compliment is seen as an insult. This feeling of indebtedness can also apply to websites that users place value on, e.g. Amazon (again!).
Amazon now orders reviews by most helpful, not by date, and displays the rating spread (i.e. how many reviews there are at each star rating), not just the average rating. One-star reviews are important because people want to know the worst experience others have had, as well as the best, so they can avoid buying a bad product.
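To make the spread-versus-average point concrete, here’s a small sketch with invented ratings: it builds the per-star histogram alongside the mean, and orders reviews by helpful votes rather than by date:

bc. from collections import Counter
# Invented reviews: (stars, helpful votes, date posted)
reviews = [
    (5, 120, "2008-06-01"),
    (1, 85, "2008-07-12"),
    (4, 10, "2008-08-03"),
    (5, 2, "2008-08-20"),
]
stars = [s for s, _, _ in reviews]
spread = Counter(stars)            # how many reviews at each star rating
average = sum(stars) / len(stars)  # the single number most sites lead with
most_helpful_first = sorted(reviews, key=lambda r: r[1], reverse=True)
print(dict(spread), round(average, 2))  # {5: 2, 1: 1, 4: 1} 3.75
print(most_helpful_first[0])            # the most helpful review, regardless of date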
eBay’s feedback profile contains lots of data. The join date is a very important piece of information; a longer membership period increases trust. eBay removed reciprocity from seller/buyer feedback as it created a toxic relationship between the two parties: sellers wouldn’t leave feedback until a buyer left positive feedback; if a buyer left negative feedback, the seller would respond in kind. Ultimately, who needs to know how good a buyer is? Apparently, eBay had wanted to remove the seller->buyer feedback for a few years but eventually bit the bullet and did it.
What can’t you do?
- Amazon.com: you can’t rate a review as helpful (or not) from a reviewer’s list of reviews. If you could, it would allow bulk, targeted fanning or hating of specific reviewers, taking the focus away from the review and onto the reviewer - an ad hominem form of rating, if you like.
- You can’t Digg someone’s Diggs on your Digg friends’ activity or profile page - again, this would make it about the person, not what they had Dugg.
- Facebook’s news feed had users up in arms when it launched, as they saw it as an invasion of privacy. None of the data was new or previously unavailable; it had just been aggregated in one place for the first time. Facebook’s response was to introduce fine-grained privacy controls, which apparently hardly anyone uses, but their mere existence pacifies people by making them feel in control.
Metrics (for pirates - AARRR!)
The usage lifecycle goes like this:
bc. Unaware -> Interested -> First-time use -> Regular use -> Passionate use
Compare that lifecycle to the AARRR metrics scheme, which stands for:
- Acquisition
- Activation
- Retention
- Referral, leading to…
- Revenue (profit!)
Your sign-up process is a funnel; it’s very likely that, of 100 people who hit a landing page, only 60 will make it to the sign-up form, and only 20 will complete sign-up and hit the confirmation page. All funnels are leaky. [It’s possible to track funnels in Google Analytics, though better solutions exist, apparently]
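A quick sketch of how you might quantify that leak - the step names and counts below are just the hypothetical 100/60/20 figures from the example above, not output from a real analytics tool:

bc. # Hypothetical sign-up funnel counts, matching the 100 -> 60 -> 20 example.
funnel = [
    ("landing page", 100),
    ("sign-up form", 60),
    ("confirmation page", 20),
]
top_of_funnel = funnel[0][1]
for (prev_step, prev_count), (step, count) in zip(funnel, funnel[1:]):
    step_rate = count / prev_count        # conversion from the previous step
    overall_rate = count / top_of_funnel  # conversion from the top of the funnel
    print(f"{prev_step} -> {step}: {step_rate:.0%} (overall {overall_rate:.0%})")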
However, number of users is not a valuable metric (take note, sales and marketing!). What is important is engagement, but how do we measure that?
Engagement analysis
Retention is a good measure of engagement. If people keep coming back, you’re doing something right.
- Do a cohort analysis on registered users who visit or do some other activity on the site (a rough sketch of the idea follows).
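Here’s a minimal sketch of cohort-style retention, with invented users grouped by sign-up month and counted if they come back in any later month:

bc. from collections import defaultdict
# Invented users: (user id, sign-up month, months in which they were active)
users = [
    ("a", "2008-06", {"2008-06", "2008-07", "2008-08"}),
    ("b", "2008-06", {"2008-06"}),
    ("c", "2008-07", {"2008-07", "2008-08"}),
]
cohorts = defaultdict(list)
for _, signup_month, active_months in users:
    cohorts[signup_month].append(active_months)
# Of each sign-up cohort, what share came back in a later month?
# (months compare correctly as strings because of the YYYY-MM format)
for month, members in sorted(cohorts.items()):
    retained = sum(1 for active in members if any(m > month for m in active))
    print(f"{month} cohort: {retained}/{len(members)} returned after signing up")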
The Viral Loop
How well are users bringing new users into the system?
- Word of mouth
- Embed a widget
- Mimic an action (e.g. Facebook apps)
- Forced sign-up
- Direct invite
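One common way to put a number on “how well are users bringing new users in” is a viral coefficient: invitations sent per user multiplied by the rate at which invitations convert into new sign-ups. The figures below are invented; the workshop didn’t give specific numbers:

bc. # Hypothetical figures for one channel, e.g. direct invites.
invites_per_user = 4.0     # invitations an average user sends
invite_conversion = 0.15   # share of invitations that become new sign-ups
viral_coefficient = invites_per_user * invite_conversion
print(f"Each user brings in {viral_coefficient:.2f} new users")
# Above 1.0 the user base grows by itself; below 1.0 the loop needs
# acquisition from elsewhere (search, press, advertising) to keep growing.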
The problem with Metrics
bq. “You get what you optimise for”
bq. “At Blogger, we determined that our most critical metric was number of posts. An increase in posts meant that people were not just creating blogs, but updating them, and more posts would drive more readership, which would drive more users, which would drive more posts.” – Evan Williams, founder of Blogger.com (and Twitter)
Fin
That’s the end of the workshop notes - check back for the conference write-up!
To be continued…