Harry Brignull is a London, UK-based independent user experience designer with a PhD in cognitive science. He is also the founder of Dark Patterns, which is dedicated to, in his words, "naming and shaming websites that use deceptive user interfaces." This article is based on a presentation he gave at the Search Marketing Expo in Munich this past April.
When Apple released iOS 6, one of the few new features not enthusiastically promoted by the company was Identifier for Advertisers (IDFA) ad tracking. It assigned each device a unique identifier used to track browsing activity, information advertisers used to target ads. Even though IDFA is anonymous, it's still unsettling to people who worry about privacy.
Fortunately, Apple included a way to disable the feature. You won’t find it in the privacy settings, however. Instead, you have to go through a series of obscure options in the general settings menu. Now, “General” is a crappy name for a menu item. It’s mainly a bucket of miscellaneous stuff that they didn’t know what to do with. In the “General” menu, select “About.” Down at the bottom of this menu, next to the terms of service and license items, there’s a menu item listed as “Advertising.”
If you haven’t been here before, the only option in the advertising menu, “Limit Ad Tracking,” is probably set to “Off.”
But let’s take a closer look at the way this is worded. It doesn’t say “Ad Tracking – Off”; it says “Limit Ad Tracking – Off”. So it’s a double negative: tracking is not being limited, which means that when this switch is off, ad tracking is actually on.
Off means on!
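The inverted wording can be made concrete with a tiny sketch. The names here are hypothetical — this models the label’s logic, not Apple’s actual code:

```typescript
// Model of the "Limit Ad Tracking" switch. The setting and the
// behaviour are inverted: the switch limits tracking, so when it is
// "Off", tracking is NOT limited — i.e. tracking is on.
function isAdTrackingEnabled(limitAdTracking: boolean): boolean {
  return !limitAdTracking;
}

// The default state, "Limit Ad Tracking – Off":
isAdTrackingEnabled(false); // true — ad tracking is on
```

A switch labelled simply “Ad Tracking” would avoid the double negative entirely; the inversion is what lets “Off” read as the reassuring choice.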
This is actually a great example of what I define as a "dark pattern."
A dark pattern is a user interface carefully crafted to trick users into doing things they might not otherwise do, such as buying insurance with their purchase or signing up for recurring bills. Normally when you think of “bad design,” you think of the creator as being sloppy or lazy — but without ill intent. Dark patterns, on the other hand, are not mistakes. They're carefully crafted with a solid understanding of human psychology, and they do not have the user’s interests in mind.
The thing about dark patterns is that they’re designed from the exact same rulebook that we use to enhance usability.
Nielsen’s 10 heuristics, probably the best-known set of usability guidelines, date back to the early 1990s. If we take three of them and invert them, we can describe Apple’s UI strategy in the above example.
Visibility of system status. Instead of showing key status information, hide it. Do this with unclear labels, obtuse navigation, and untimely messages.
Match between system and real world. Instead of "speaking the user's language," the system should use "weasel wording" so that it appears to say one thing while it really says another.
User control and freedom. Take advantage of your users' natural capacity to make mistakes to have them accidentally complete actions that are beneficial to your objective.

Trick questions
Marketing emails use this tactic all the time. You've probably seen this before. After you register to access something on the web, you're asked if you want to be placed on a mailing list. This particular approach is fairly standard but isn’t hugely effective because users have to take an explicit action to opt in. Chances are they’ll be in a hurry and a proportion of users won't even notice this text. Some websites use mandatory radio buttons with neither option (yes or no) preselected. This way the user can't get on to the next page without making an explicit choice. This in itself is still above-board. But if we think back to our anti-usability principles, we can see how not calling attention to this choice can be used to trick us into choosing something we don't actually want.
For instance, post-office.co.uk is designed not to draw any attention to the option, hoping that you opt in by mistake. Here, a tick means no. It’s kind of clever because culturally, a tick is an affirmative action.
And they’ll definitely get opt-ins from those people who don’t pause to read this stuff. On the one hand this works — they will boost the mailing list opt-in rate — but a certain number of people will realize that the website is pulling a trick and they will swear angrily under their breaths. It’s probably not going to make them drop out just yet, but it is going to tarnish the brand's reputation, at least a little bit.
Royalmail.co.uk takes it a step further: two rows of checkboxes, where ticking the first row opts you out and ticking the second row opts you in.
Have you ever heard of a trammel net?
It’s a type of fishing net that is made up of two layers of different types of netting. The fish — or your user — can either get caught up by the first layer, or the second layer, or they can get stuck between the two. They’re banned in most kinds of commercial fishing, but it seems you can put them in your UIs without any legal repercussions.
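The two-layer trap can be modelled in a few lines. The field names are hypothetical — this is a sketch of the logic, not Royal Mail’s actual form code:

```typescript
// Two rows of checkboxes: ticking row 1 opts you OUT of marketing
// mail, ticking row 2 opts you back IN.
interface MailingForm {
  row1Ticked: boolean; // "tick here to opt out"
  row2Ticked: boolean; // "tick here to opt in"
}

// You stay on the mailing list unless you tick row 1 AND leave
// row 2 unticked — every other combination catches you in the net.
function receivesMarketingMail(form: MailingForm): boolean {
  return !form.row1Ticked || form.row2Ticked;
}
```

Of the four possible combinations, only one escapes; a user skimming the form is far more likely to land in one of the other three.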
These examples are all kind of tiptoeing around the problem, though. We could actually take this to another level entirely and get rid of any uncertainty whatsoever.
At present, Quora doesn’t mess around with opt-ins or questions of any kind. They just opt you in as part of the terms of service. This is what you see once you’ve registered — if you take the time to go to the email notifications page.
Currently, there are 35 email notifications. You’re automatically opted in to most of them.
The thing to take away here is that although it’s easy to play these tricks, they will piss off your users. It’s quite useful to think of your brand’s relationship with your users in human terms.
Okay so let’s move on to another dark pattern — Forced continuity.
Theladders.com is a fairly big US-based job board, founded nine years ago. They’ve got about 400 employees and made roughly $100 million in revenue last year. VC-funded too. What I’m about to tell you is quite hard-hitting, so please do check this yourself and let me know if I’m wrong. Anyway, let’s sign up for a free basic membership.
Since I’m signing up for free, there’s no point in reading this stuff, right?
After this I’ll go through a few sign-up steps and then I’ll search for a job. Here are my search results. Let’s say the second one down there looks really appealing.
I clicked "apply" a moment ago and I thought I was going to see the job details and the application form. Instead I’m seeing a paywall and it’s telling me that I need to upgrade to apply for this role!
Now what I don’t know right now is that this job ad is freely available on the web elsewhere.
This also explains why the site disables text selection: they want to discourage people from bypassing their paywall by copying the job description and pasting it into Google as a search term. They don’t want people to get to the true source. In this case the job was published on Bloomberg’s careers site, where you can apply for free.
I haven’t taken a large sample, but from a cursory analysis it looks like a fairly large chunk of the listings behind the paywall are available free elsewhere on the web.
What normal person explores the account settings pages?
That’s tricky enough, but the forced continuity doesn’t come into play until you go to sign up. While the premium membership page prominently displays its pricing (from $25 for one month to $150 for a year), it obscures the fact that the one-month package automatically renews by displaying this information in gray 10-point text on a gray background at the bottom of the page. Once you’ve signed up, you have to dig into the membership page (under account settings) to turn it off. This is the same way that Apple hides the IDFA settings in iOS.
Here’s the last pattern: misdirection. Let’s imagine you’ve done a search for “cannot empty clipboard in Excel” and you find yourself on Experts Exchange.
The way that the page is designed, it looks like the answer is behind a paywall. In fact it’s just way, way down the page — right at the bottom.
This trick gives them an SEO benefit while simultaneously tricking users into subscribing. They’ve actually been doing this for years — since 2007, in fact.
This is a good case study showing what can happen if you systematically use dark patterns as part of your growth strategy.
Experts Exchange could still be a dominant force today, but they’re not. They got greedy, they used dark patterns, and everyone got annoyed with them and migrated to a friendlier, more ethical competitor.
When you look at your customers in aggregate, it’s easy to be very detached and impersonal about it. To understand the reality of what it’s like to be on the receiving end of your product, you have to zoom in.
Good design — and good business — is all about empathy with our fellow humans. In fact it’s not really limited to business — it’s society as a whole. It’s what defines us as humans. To understand the true impact of your designs, you have to work at a human level of focus. You have to see the whites of their eyes and their facial expressions. That’s really the whole point.
At the end of the day, you should evaluate what you really want from your customers. Do you just want them to just use your service, or do you want more?
Personally I think usage alone is cheap. A good brand is liked. A great brand is loved and respected. You’ll never reach that point if you use dark patterns.
[Note: All examples are current as of early 2013.]