
Dark Patterns: A shadow over consumer trust and experience

Manisha Kapoor, CEO & Secretary General, ASCI, discusses rampant dark patterns in digital shopping: AI chatbots, consumer manipulation, and the erosion of online autonomy.


When consumer apps first emerged, they promised ease of use. But with rampant dark patterns, consumers now have to wonder if they are being taken for an online ride. As AI chatbots become the first port of call for consumer support, digital shoppers become easy targets without recourse to human intervention. Dark patterns pervade websites and apps for both impulse and essential purchases, nudging consumers into transactions they are not comfortable with.

Dark commercial patterns are business practices employing elements of digital choice architecture, in particular in online user interfaces, that subvert or impair consumer autonomy, decision-making or choice. They often deceive, coerce or manipulate consumers and are likely to cause direct or indirect consumer detriment in various ways, though it may be difficult or impossible to measure such detriment in many instances.

Privacy deception, interface interference, drip pricing and false urgency are some of the recurring manipulations that mar the user’s experience during digital shopping.

Breaking down the purchase cycle

At the decision-making phase, time-based pressures (a countdown timer or limited-time offer) and/or stock-based pressures (‘only a few left’) create a sense of urgency, playing on consumers’ fear of missing out.

When booking a hotel room or buying apparel, it is common to find reminders declaring ‘Only two left’, or ‘Five others are looking at this right now’. Such ‘scarcity’ pressures the buyer to act before the deal disappears, possibly leading to impulsive decisions and unintended spending. 
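
To make the mechanics concrete, here is a minimal, hypothetical sketch of how such a banner can be generated client-side, independently of real inventory. The names, numbers, and thresholds are illustrative assumptions, not drawn from any actual platform.

```typescript
// Hypothetical sketch: a false-urgency banner driven by marketing rules,
// not by real inventory. All names and values are illustrative.

interface Listing {
  name: string;
  actualStock: number; // true inventory, known only to the seller
}

function urgencyBanner(listing: Listing): string {
  // The "viewers" figure is invented on the spot, and the scarcity
  // message is shown regardless of how much stock actually exists.
  const fakeViewers = 3 + Math.floor(Math.random() * 5);
  if (listing.actualStock > 0) {
    return `Only 2 left! ${fakeViewers} others are looking at this right now`;
  }
  return "Back in stock soon";
}

console.log(urgencyBanner({ name: "Hotel room", actualStock: 140 }));
// e.g. "Only 2 left! 6 others are looking at this right now"
```

The point of the sketch is that the message is a marketing rule, not a reading of stock levels, which is why such claims cannot be taken at face value.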

User interface interference uses visual cues to highlight or bury options, misleading users into believing one choice is better than another and selecting against their original intention. Using confusing fine print for discouraging details or guilt-tripping users into a choice are other examples of interface interference. Having to manually opt out of pre-selected options is an interference many would be familiar with, as it often appears at checkout; a sketch follows below.
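
A minimal sketch, assuming a hypothetical checkout with illustrative add-on names and prices, of how a pre-checked option silently inflates the total when the user never touches the checkbox:

```typescript
// Hypothetical checkout add-ons, contrasting a pre-selected opt-out
// with a genuine opt-in. All labels and prices are illustrative.

interface AddOn {
  label: string;
  price: number;       // in rupees
  preChecked: boolean; // true = the user must notice and untick it
}

const addOns: AddOn[] = [
  { label: "Trip insurance", price: 199, preChecked: true },  // opt-out (dark pattern)
  { label: "Gift wrap", price: 49, preChecked: false },       // opt-in (neutral)
];

// If the user never interacts with the checkboxes, only the
// pre-checked items silently land in the cart total.
function defaultCartExtras(items: AddOn[]): number {
  return items
    .filter((a) => a.preChecked)
    .reduce((sum, a) => sum + a.price, 0);
}

console.log(defaultCartExtras(addOns)); // -> 199, added without an explicit choice
```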

At the payment stage, more dark patterns such as drip pricing and basket sneaking take hold. Convenience fees, platform fees, and minimum basket value fees rear their heads just as consumers are about to pay, inflating the final price. What may have seemed like a steal becomes a burden. Even then, consumers more often than not go through with the purchase.

In a recent case in the District of Columbia in the United States against StubHub, the Attorney General alleged that instead of disclosing upfront the total price consumers must pay, StubHub creates a complicated order flow that causes consumers to invest needless time and effort in the purchase process before they learn what the price really is. This includes advertising a “deceptively low price” that excludes mandatory fees, requiring consumers to input personal information before the full ticket price is disclosed, pressuring them to buy quickly with a countdown timer, and creating an impression of scarcity by displaying the number of people who viewed the event in the last hour, stating that tickets are selling fast, and indicating that the tickets under consideration are the last ones available in that section.
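
A minimal sketch of the drip-pricing arithmetic described above; the fee names and amounts are illustrative assumptions, not figures from the StubHub case:

```typescript
// Drip pricing in miniature: the advertised price omits mandatory fees
// that surface only at the final payment step. All values illustrative.

const advertisedPrice = 499; // what the listing page shows

const mandatoryFees = {
  convenienceFee: 59,
  platformFee: 25,
  smallBasketFee: 30,
};

// The total the consumer actually pays is revealed only after they have
// invested time entering details and reaching checkout.
const finalPrice =
  advertisedPrice +
  Object.values(mandatoryFees).reduce((sum, fee) => sum + fee, 0);

console.log(`Advertised: ₹${advertisedPrice}, at checkout: ₹${finalPrice}`);
// -> Advertised: ₹499, at checkout: ₹613
```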

As per an EU study on advertising and marketing practices in online social media, 97% of apps and websites accessed by EU citizens contained at least one dark pattern. The study also highlighted that such practices are rarely used in isolation; one interface design may contain several dark patterns.

The Indian scenario

A recent report by ASCI Academy and Parallel HQ, "Conscious Patterns," analysed 12,000 screens from 53 top Indian apps across nine industries, finding an average of 2.7 dark patterns per app. The investigation revealed that all but one of the apps used manipulative dark patterns. Health-tech apps had the highest incidence of deceptive patterns, at a whopping 8.8 patterns per app. Privacy deception was the most commonly deployed dark pattern, found in 79% of apps.

Both ASCI and the Central Consumer Protection Authority (CCPA) have guidelines on what constitutes a dark pattern and how such patterns impair consumers’ ability to make informed decisions.

Exploiting cognitive biases

Dark patterns take advantage of natural thinking habits, or “cognitive biases”, pushing users towards choices that may not be in their best interest but serve the brand’s profit.

Exploiting cognitive biases influences user judgement to favour shortcuts rather than careful consideration. At checkout, many do not abandon their carts despite dark patterns weighing them down with irritation and concern. People’s consistency bias and decision fatigue come into play: the former makes them stick to choices made with time and effort, even when new information suggests it is not the best decision, and the latter makes opting out of pre-selected options feel like a hassle.

For airline tickets, everything from a convenience fee to charges for seats and meals is piled onto the ticket price. If the user does not check out, there is the fear of not getting the current price again for a need that will eventually have to be met. Pre-checked extras like travel insurance prey on decision fatigue, as weary users give up on choosing.

Before checkout, scarcity and loss-aversion biases make shoppers fear losing out on a ‘deal’ that may not be genuine but is merely presented as one. Fake reviews trick many into thinking a product is in demand or effective, catering to their social-proof bias. Yet many consumers still complete the purchase due to behavioural biases or a lack of awareness.

The consumer experience

Overall, there is a lack of consumer awareness regarding the use of dark patterns, but once such a practice is identified, there is likely to be a negative perception of such interfaces. The average consumer’s ability to discern such practices is limited. Consumers may also tend to accept these practices as part of their normal digital experience. There is a real danger of such practices becoming the default online norm. Particularly vulnerable are senior citizens and new Internet users, as they may accept such patterns as an integral part of the online world.

Incessant exposure to dark patterns can create consumer resentment and lead to unexpected costs, loss of time and trust, negative experiences, compromised privacy, behavioural manipulation, and limited autonomy. As people around the world become more digitally astute, this is set to change. The American research agency Dovetail surveyed 1,000 e-commerce and social media users aged 18 to 54 and found they were annoyed by dark patterns, with a majority losing trust and some even boycotting the erring brands. In India, a quick-commerce platform recently drew flak for making consumers who were already on a premium free-delivery subscription opt out of a delivery charge on every purchase.

Globally, regulators are taking steps to curb dark patterns. The EU is leveraging existing laws like the GDPR, while India has introduced guidelines at both the self-regulatory and regulatory levels. The Digital Personal Data Protection (DPDP) Act, passed last year, will encourage better privacy norms. Greater regulatory action will also act as a strong deterrent to such practices.

Way forward

Eventually, the choice to deploy such interfaces boils down to organisational ethics and culture, and how brands wish to treat their consumers. Formal legislation will always play catch-up to technological workarounds. It is the brand’s custodians who must make the choice to build long-term trust. In time, there is likely to be an opportunity for brands to become positive disruptors by putting consumer interest at the heart of interface design and, in doing so, repositioning brands that rely on such tactics as manipulative and deceptive. Interfaces where users feel safe are perhaps exactly what we need now.

This article is penned by Manisha Kapoor, CEO & Secretary General, ASCI. 

Disclaimer: The article features the opinion of the author and does not necessarily reflect the stance of the publication.
