
Dark Patterns: The questionable art of boosting conversion rates

Dark Patterns are questionable user-interface techniques that some web designers employ to nudge or subtly trick users into doing things. They’re known to boost conversion rates, but where does the slippery slope end? UX expert Harry Brignull investigates...


Back in the Seventies, cognitive psychologists started to realise that all humans tend to make the same categories of mistakes, which they named cognitive biases. They had effectively found a set of mental Achilles' heels that we're all prone to.

It became obvious that this discovery could be applied in different ways. We could use this knowledge for the benefit of mankind, to help avoid things like industrial accidents and catastrophic errors in judgement – or we could use it as a weapon of deception, to get one-up on our fellow humans.

In about 2005, cognitive biases became a hot topic in web design. What's interesting is the way we as an industry reacted to this knowledge. Sadly, we didn't turn the lens of analysis on ourselves – we didn't say 'Hey, what does this mean about the way we make design decisions?' or 'How can we prevent these biases from tripping up our users?'. Instead, a lot of web designers went over to the dark side and asked 'How can we exploit these biases? How can we use them to really push our conversion rates up?'

Like casino architects who install bright lighting and hide clocks, or advertising executives who sell products on half-truths, a number of us have decided to apply our new-found understanding of psychology to create subtly deceptive user interfaces that nudge or trick users into doing things – using techniques now known as dark patterns. Whether it's low-cost airlines that sneak travel insurance into your basket, or eCommerce sites that sign you up for a monthly membership through hidden small print, these tricks have become increasingly common. As an industry, we desperately need to take a stance – which of these patterns are acceptable, which are borderline and which should be outright banned?

How Do Dark Patterns Work?

The interesting thing about dark patterns is that they are designed from the exact same corpus of knowledge that we use to enhance usability. This means that if you're a good web designer capable of creating great user experiences, you probably already have the know-how to create some very sinister dark patterns. All you need to do is take the usability principles you know and then invert them.

Why do businesses use Dark Patterns?

Dark patterns typically boil down to a judgement about ethics. There is always going to be a tension between what customers want – low prices and outstanding service – and what businesses want – namely, maximising profit. It's no surprise that businesses want to experiment with what's possible. After all, they often find combinations that customers find acceptable. For example, it's slightly manipulative that supermarkets put bread and milk at the back of the store to tempt people to walk past the rest of their stock, but people don't tend to mind. IKEA takes this principle even further and turns the entire store into a maze – customers might moan a little, but they keep flocking back. Similarly, nightclubs often let the queues outside run around the block even when it's not yet busy inside, just to attract more passers-by. People seem to accept these tactics as something businesses need to do to stay afloat, and treat them as just part of the cost of being a customer.

The real allure for online businesses is that dark patterns push conversion rates up. They tend to win in A/B tests and, as far as the analytics data goes, they appear to be a good thing. In reality, though, there are many things that analytics data doesn't capture. Whenever customers notice they've been caught out by a dark pattern, they develop a negative opinion of the brand's personality. If the brand keeps trying to take advantage of them, they'll begin to hate it, and they'll spread that opinion among their friends. Even if the company has no ethical qualms and doesn't mind being perceived as a bit low-end, there's still a very real risk that customers will migrate to a less frustrating competitor. After all, this is the web – competitors are only one click away.

Trick Questions

Trick questions are one of the most common types of dark pattern, often seen in the marketing communication options of registration forms. They rely on the fact that people usually scan web pages rather than reading them word-for-word like a novel. For example, since ticking a checkbox is usually taken to have a positive sentiment, some sites use double negatives to flummox users and make them believe they are opting out when they are actually opting in.

A tick usually means yes – except when it's combined with a negative statement, which trips up many users. A similar deceptive trick is to show a series of marketing options but to alternate the sentiment, so that ticking the checkbox opts you in on one line and opts you out on another. Users have to really keep on their toes to avoid being duped by the way the sentiment switches between lines.
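To see why the pattern works, here's a minimal TypeScript sketch – the labels and option list are invented for illustration, not taken from any real form – contrasting what a skim-reading user assumes a tick means with what the form actually records:

type MarketingOption = {
  label: string;
  // true: ticking the box opts the user IN; false: the statement is negative,
  // so ticking the box opts the user OUT.
  tickMeansOptIn: boolean;
};

const options: MarketingOption[] = [
  { label: "Tick here if you wish to receive offers from us", tickMeansOptIn: true },
  { label: "Tick here if you do not wish to hear from our partners", tickMeansOptIn: false },
];

// What the form actually records on submission.
function isOptedIn(option: MarketingOption, ticked: boolean): boolean {
  return option.tickMeansOptIn ? ticked : !ticked;
}

// What a skim-reader tends to assume: a tick simply means 'yes, contact me'.
function assumedOptIn(ticked: boolean): boolean {
  return ticked;
}

// Leaving every box unticked – the most common skimming behaviour – silently
// opts the user in to the second, negatively worded list.
for (const option of options) {
  console.log(option.label, {
    assumed: assumedOptIn(false),
    actual: isOptedIn(option, false),
  });
}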

Trick questions are known to boost conversions, so they look impressive on your analytics dashboard, but they lower the quality of the lists generated. What's more, they can end up really irritating users, defining the personality of your brand in their minds as one that's quite annoying. Arguably, if you don't mind being perceived as a seedy 'stack 'em high, price 'em cheap' type of business then this is just about okay, but users only have a limited amount of patience and your competitor isn't far away.

Forced Continuity

This is a particularly devious trick that involves getting users to sign up to a paid monthly membership when they think they are only buying a one-off product. One large fashion retailer in the USA recently had a class action lawsuit filed against it for using this pattern. Its implementation of the trick involved sticking a clause in the terms and conditions stating that buying a single item would automatically enrol customers into its VIP membership program. On its own that sounds benign enough – until the customer checks their credit card statement and finds a repeating monthly charge, a sum small enough to go unnoticed for quite some time. Very cheeky indeed, and most likely illegal in the UK. Small print can be used to hide all sorts of tricky clauses, but surely this is going too far?

Bait and Switch

This involves enticing users in with an attractive sales pitch, only to reveal a far less desirable outcome. Bait and switch is one of the oldest tricks in the book, and it even features in fairy tales like Rumpelstiltskin. In spite of this familiarity, it's still an effective trick on the web. One popular implementation used on travel sites is to hook the user in with a low price, cleverly labelled with the prefix 'From…', then show a much higher price when they select their specific dates and configuration at a later stage. An even more worrying variant of this is allegedly being tested by some unscrupulous online retailers (a type of 'dynamic pricing') whereby the cost of tickets or other time-sensitive items is automatically increased if a user returns to a product detail page after viewing it a few days previously – not because of increased scarcity, but simply because of the user's behaviour pattern.
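As a purely illustrative TypeScript sketch of how that alleged variant could work – the logic, names and numbers are invented for the example, not taken from any real retailer – the mechanism needs nothing more than a count of a visitor's views of a product page:

// Hypothetical sketch only: bump the price for a returning visitor,
// based on nothing but their own browsing history.
const viewCounts = new Map<string, number>(); // key: visitorId + productId

function quoteOnView(visitorId: string, productId: string, basePrice: number): number {
  const key = `${visitorId}:${productId}`;
  const views = (viewCounts.get(key) ?? 0) + 1;
  viewCounts.set(key, views);
  // First visit: list price. Repeat visits signal intent, so the price creeps up.
  const markup = views > 1 ? 0.08 : 0;
  return Math.round(basePrice * (1 + markup) * 100) / 100;
}

console.log(quoteOnView("visitor-42", "gig-ticket", 50)); // 50 on the first look
console.log(quoteOnView("visitor-42", "gig-ticket", 50)); // 54 when they come back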

Sneak into basket

This pattern was popularised by low-cost airlines who have a habit of sneaking insurance into users’ baskets when they’re trying to buy flights online.

Most eCommerce websites show an upsell page at some point in their checkout flow, serving the purpose of enticing you into buying extras and add-ons. For airlines, travel insurance has one of the best profit margins among the upsells, so some sites subtly preselect this option, requiring the user to take action to opt out rather than to opt in. This additional 'cognitive friction' is enough to push sales, and many users don't even realise they've bought insurance until it's too late. Often when they do realise, the price point is so low (usually around £10) that they feel it's barely worth the effort to call up and jump through a load of hoops to cancel it. If a user doesn't stop to read the details, they'll end up buying the insurance upsell without even realising.
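Here's a minimal TypeScript sketch of why the preselected default matters – the item names and the £10 price are illustrative, based only on the description above. The honest version requires an explicit opt-in; the dark pattern requires an explicit opt-out.

type BasketItem = { name: string; price: number };

function buildBasket(flightPrice: number, insuranceSelected: boolean): BasketItem[] {
  const basket: BasketItem[] = [{ name: "Flight", price: flightPrice }];
  if (insuranceSelected) {
    basket.push({ name: "Travel insurance", price: 10 });
  }
  return basket;
}

// Honest flow: the insurance checkbox starts unticked, so doing nothing buys just the flight.
const honestDefault = buildBasket(120, false);

// Dark pattern: the checkbox starts ticked (or a dropdown defaults to 'Yes'),
// so the user has to notice it and act in order to avoid the extra charge.
const sneakyDefault = buildBasket(120, true);

console.log(honestDefault); // [{ name: 'Flight', price: 120 }]
console.log(sneakyDefault); // [{ name: 'Flight', price: 120 }, { name: 'Travel insurance', price: 10 }]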

Faraway Bill

If an organisation hides a user's bills, then it's easy for that user to overspend. This is a grey area, and it's not clear whether it is a dark pattern in all cases. Under the premise of green initiatives, many organisations have switched to eBilling, where customers need to log in, navigate to the appropriate area, and click through a number of menu items – unlike the days of paper bills, when all you had to do was open an envelope. Since many billing systems run on legacy software that is hard to update, some might argue that this is just bad design rather than anything actively nefarious. Either way, more effort needs to be made to communicate with end users.

Friend Spam

Apps and games ask for permission to access your account, and some exploit this by secretly publishing content as if it was written by you. One person gives an app access to their account; it then publishes an endorsement that appears to be written by that user. A few of their friends see the post, trust it as coming from a reliable source and register too, at which point the app publishes more endorsements. A typical permissions dialogue of this kind asks the user to allow a game to post endorsements before they've even played it – but how can you endorse something you've yet to see?

Privacy Zuckering

When social networks first appeared, many consumers didn't realise that 'if you're not paying for the product, you are the product'. It's in a social network's commercial interests to collect your personal information and expose it in ways that are commercially beneficial, such as behavioural ad targeting, using your face to endorse products, or encouraging you to post about brands. However, they need your legal permission to do this, so how do they get around it? Funnily enough, they don't have a large button that says 'sell my personal details to advertisers' – they'd go out of business. Instead they obfuscate their user interface in a way that hides your settings and makes it hard work for you to get the exact level of privacy that you want. Of course there's nothing wrong with a business model that relies on targeted advertising, but it needs to be transparent and honest if it's going to earn the trust and respect of its user base.

Roach Motel

This is when a site has fantastic usability in registration, but is difficult to leave. Most websites have a lot of attention focused on perfecting their sign-up process, because this is where membership revenue comes from. However, churn is always a problem: a business needs to keep the inflow higher than the outflow or it will die. The best solution is to provide a good service, tempting users to stay of their own accord. However, the dark pattern approach is to try to trap them, to generate more revenue from recurring membership fees. For example, instead of having a Cancel Account button, the business could require the user to ring a call centre. This is more effort, so the user might delay, generating more revenue. When they do finally call, the call centre operative can try to convince them to stay, offering a last-ditch discount or freebie.

User Profile Price Discrimination

It's easy for a server to find a user's location and platform, but is it ethical for an eCommerce site to alter its prices because it thinks you're rich? A recent Wall Street Journal article (on.wsj.com/Tj1W2V) found that a number of different US eCommerce companies were detecting users' locations and serving up higher prices to people in richer areas. The companies argued that this was done simply to match the prices in their local bricks-and-mortar stores. Users were understandably upset when they found out, because the price changes seemed arbitrary and punitive – it's not as if it was costing the company any more to serve each different location. Price discrimination in general is a very grey area. For example, people don't mind the fact that tickets go up in price nearer the date, and in the UK we're used to paying more for electronic goods than people do in the USA. But profiling users without their knowledge mixes privacy concerns with unfair discrimination, and many people find that too bitter a pill to swallow.
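To make the mechanism concrete, here's a minimal TypeScript sketch of geo-based price adjustment as described in that investigation – the region names, multipliers and lookup are invented purely for illustration, not taken from any real retailer:

type PriceRule = { region: string; multiplier: number };

const rules: PriceRule[] = [
  { region: "affluent-suburb", multiplier: 1.15 },
  { region: "average-area", multiplier: 1.0 },
  { region: "near-rival-store", multiplier: 0.92 },
];

// Assume some geolocation step (e.g. an IP-to-region lookup) has already run.
function quotePrice(basePrice: number, userRegion: string): number {
  const rule = rules.find(r => r.region === userRegion);
  return Math.round(basePrice * (rule?.multiplier ?? 1.0) * 100) / 100;
}

// Two users looking at the same product see different prices, even though it
// costs the retailer nothing extra to serve one location rather than another.
console.log(quotePrice(49.99, "affluent-suburb")); // 57.49
console.log(quotePrice(49.99, "near-rival-store")); // 45.99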

Conclusion

Back in the fifth century BC, Greek physicians realised the power that came with their newfound knowledge, so they wrote the Hippocratic oath. Should we do the same in web design? It's clear that the combination of applied psychology and user interface design can be very powerful. Heavy-handed regulation rarely ends well – we all know how the cookie law panned out – so isn't it time we took matters into our own hands and agreed upon a set of rules to work by? One thing's for sure: if we continue to ignore it, this problem is only going to get worse.
