Is Social Proof a Dark Pattern?

Is it bad to use Social Proof in Digital Marketing? Is there a bad way and a good way to use Social Proof and Social Influence? And the big question: Is Social Proof a Dark Pattern?

In 2010, user interface designer Harry Brignull coined the phrase “Dark Pattern” and began maintaining a website that documents these patterns, in an effort to shame the companies behind them.

Since then, governments and non-profit organizations have shown a growing interest in monitoring and preventing dark patterns in order to protect consumers.

What is a Dark Pattern?

Brignull defines Dark Patterns as user interface design choices that benefit an online service by coercing, steering, or deceiving users into making decisions that they might not make if they were fully informed and capable of selecting alternatives.

This is the list of Dark Patterns that Brignull publishes on his website:

List of Dark Patterns

1. Trick questions

While filling in a form you respond to a question that tricks you into giving an answer you didn’t intend. When glanced upon quickly the question appears to ask one thing, but when read carefully it asks another thing entirely.

2. Sneak into Basket

You attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page.

3. Roach Motel

You get into a situation very easily, but then you find it is hard to get out of it (e.g. a premium subscription).

4. Privacy Zuckering

You are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.

5. Price Comparison Prevention

The retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision.

6. Misdirection

The design purposefully focuses your attention on one thing in order to distract your attention from another.

7. Hidden Costs

You get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g. delivery charges, tax, etc.

8. Bait and Switch

You set out to do one thing, but a different, undesirable thing happens instead.

9. Confirmshaming

The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.

10. Disguised Ads

Adverts that are disguised as other kinds of content or navigation, in order to get you to click on them.

11. Forced Continuity

When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases this is made even worse by making it difficult to cancel the membership.

12. Friend Spam

The product asks for your email or social media permissions under the pretence it will be used for a desirable outcome (e.g. finding friends), but then spams all your contacts in a message that claims to be from you.

All of these “Dark Patterns” involve false information, misleading or hidden content, or forced actions.

Dark Patterns at Scale: A controversial research document

In June 2019, a group of researchers from Princeton University published a paper entitled “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (Arunesh Mathur et al.).

In it, they introduced new Dark Patterns such as Social Proof, Scarcity and Urgency.

In his extensive Medium article “What Princeton University researchers got wrong about Dark Patterns”, Jochen T. Grünbeck explains what is wrong with this paper.

The authors themselves mention that just 183 of the 11,000 websites actually use deceptive patterns, which is a mere 1.66%.

So why not call it “Less than 2% of websites use deceptive practices”? Did the authors want to use FEAR and AUTHORITY to promote their own publication?

FEAR is actually a very strong Dark pattern frequently used to manipulate populations.

AUTHORITY is another potential Dark pattern: dress like a surgeon and I will trust that you are a doctor. Use your title or the name of your school and I will believe what you write or say.

Additionally, it is interesting to note that the authors appear to be influenced by confirmation bias: at least three of the study’s authors (Arunesh Mathur, Arvind Narayanan and Günes Acar) have spent years investigating companies’ data collection, data use and deceptive practices, and they interpret information in a way that supports their existing beliefs.

Is Social Proof a Dark Pattern?

Arunesh Mathur et al. classify Social Proof as a Dark pattern and describe it as “Influencing users’ behavior by describing the experiences and behavior of other users.” 

To quote Wikipedia: “Social proof is a psychological and social phenomenon wherein people copy the actions of others in an attempt to undertake behavior in a given situation.”

Think about all the situations in life where you encounter Social Proof.

Credit: Austin Brown/Bloomberg

If Social Proof were a Dark Pattern, people would have to stop queuing in front of restaurants, and the rest of us would have to stop paying attention to the queues.

The way most e-commerce and online businesses use Social Proof is honest (remember, only 1.66% of the websites in the study used deceptive practices). That means they show you real numbers and real reviews from real customers, just like the social signals you can see every day in the street.

Social Proof apps like Nudgify have made clear statements about Dark Patterns, and especially about fake Social Proof, in their Ethical Social Proof publication.

When used dishonestly, Social Proof can become a Dark Pattern, but in most cases it is based on genuine information and is therefore not deceptive.

How powerful are Dark Patterns vs Social Proof?

In a more recent publication in the Journal of Legal Analysis, “Shining a Light on Dark Patterns” (23 March 2021), Jamie Luguri and Lior Jacob Strahilevitz set out to measure the power of dark patterns.

So they compiled the Dark Patterns identified by several researchers, including the Princeton group: Social Proof, Scarcity and Urgency.

This is their “summary of existing dark pattern taxonomies”:

Nagging
- Repeated requests to do something the firm prefers (Gray et al. 2018)

Social proof
- Activity messages: false/misleading notice that others are purchasing or contributing (Mathur et al. 2019)
- Testimonials: false/misleading positive statements from customers (Mathur et al. 2019)

Obstruction
- Roach motel: asymmetry between signing up and canceling (Gray et al. 2018; Mathur et al. 2019)
- Price comparison prevention: frustrates comparison shopping (Brignull 2020; Gray et al. 2018; Mathur et al. 2019)
- Intermediate currency: purchases in virtual currency to obscure cost (Brignull 2020)
- Immortal accounts: account and consumer info cannot be deleted (Bösch et al. 2016)

Sneaking
- Sneak into basket: an item the consumer did not add is in the cart (Brignull 2020; Gray et al. 2018; Mathur et al. 2019)
- Hidden costs: costs obscured or disclosed late in the transaction (Brignull 2020; Gray et al. 2018; Mathur et al. 2019)
- Hidden subscription / forced continuity: unanticipated or undesired automatic renewal (Brignull 2020; Gray et al. 2018; Mathur et al. 2019)
- Bait and switch: customer is sold something other than what was originally advertised (Gray et al. 2018)

Interface interference
- Hidden information / aesthetic manipulation: important information visually obscured (Gray et al. 2018)
- Preselection: firm-friendly default is preselected (Bösch et al. 2016; Gray et al. 2018)
- Toying with emotion: emotionally manipulative framing (Gray et al. 2018)
- False hierarchy / pressured selling: manipulation to select the more expensive version (Gray et al. 2018; Mathur et al. 2019)
- Trick questions: intentional or obvious ambiguity (Gray et al. 2018; Mathur et al. 2019)
- Disguised ad: consumer induced to click on something that is not apparent as an ad (Brignull 2020; Gray et al. 2018)
- Confirmshaming: choice framed in a way that makes it seem dishonorable or stupid (Brignull 2020; Mathur et al. 2019)
- Cuteness: consumers likely to trust an attractive robot (Lacey & Caudwell 2019)

Forced action
- Friend spam / social pyramid / address book leeching: manipulative extraction of information about other users (Brignull 2020; Bösch et al. 2016; Gray et al. 2018)
- Privacy Zuckering: consumers tricked into sharing personal info (Brignull 2020; Bösch et al. 2016; Gray et al. 2018)
- Gamification: features earned through repeated use (Gray et al. 2018)
- Forced registration: consumer tricked into thinking registration is necessary (Bösch et al. 2016)

Scarcity
- Low stock message: consumer informed of limited quantities (Mathur et al. 2019)
- High demand message: consumer informed that others are buying the remaining stock (Mathur et al. 2019)

Urgency
- Countdown timer: opportunity ends soon, with a blatant visual cue (Mathur et al. 2019)
- Limited time message: opportunity ends soon (Mathur et al. 2019)

And they also mention this:

In our revised taxonomy we have been more careful than the existing literature to indicate that social proof (activity messages and testimonials) and urgency (low stock/high demand/limited time messages) are only dark patterns insofar as the information conveyed is false or misleading. If a consumer is happy with a product and provides a favorable quote about it, it is not a dark pattern to use that quote in online marketing, absent a showing that it is misleadingly atypical.

In order to assess the power of each pattern, Luguri & Strahilevitz decided to conduct A/B Testing with a sample of 1,963 participants.

They tested how readily participants gave their consent when different patterns were introduced, in different combinations.

Here are the A/B test statistics for their Study 2, which did not contain any aggressive dark patterns:

Acceptance rates (%) by content and form condition (p-values in parentheses)

Content condition       Control            Recommended        Default            Obstruction
Control                 13.2               15.1 (p = 0.46)    15.0 (p = 0.49)    19.5 (p = 0.03)
Scarcity                10.6 (p = 0.39)    10.8 (p = 0.41)    18.9 (p = 0.061)   17.4 (p = 0.19)
Confirmshaming          20.5 (p = 0.02)    16.4 (p = 0.29)    21.0 (p = 0.012)   20.4 (p = 0.03)
Social proof            19.0 (p = 0.053)   21.0 (p = 0.01)    21.4 (p = 0.009)   27.9 (p < 0.001)
Uplift (Social proof)   +43%               +39%               +42%               +43%
Hidden information      30.8 (p < 0.001)   28.7 (p < 0.001)   26.7 (p < 0.001)   34.5 (p < 0.001)

The “Uplift” row shows the increase of the Social proof row over the Control row in each column.

Dark patterns such as “Hidden information” increased the Acceptance rate by more than 77%, while Social Proof increased the Acceptance rate by 39% to 43%.
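As a quick sanity check, those uplift figures can be recomputed from the acceptance rates in the table above. The snippet below is a minimal sketch: the rates are copied from the table, while the variable and function names are ours, not the study's.

```typescript
// Acceptance rates (%) from Luguri & Strahilevitz, Study 2 (see table above).
// Each object holds one content-condition row, keyed by form condition.
const control = { control: 13.2, recommended: 15.1, default: 15.0, obstruction: 19.5 };
const socialProof = { control: 19.0, recommended: 21.0, default: 21.4, obstruction: 27.9 };
const hiddenInformation = { control: 30.8, recommended: 28.7, default: 26.7, obstruction: 34.5 };

// Relative uplift of a variant over the control row, in percent (rounded).
function uplift(variant: number, baseline: number): number {
  return Math.round(((variant - baseline) / baseline) * 100);
}

for (const key of Object.keys(control) as (keyof typeof control)[]) {
  console.log(
    `${key}: Social proof +${uplift(socialProof[key], control[key])}%, ` +
      `Hidden information +${uplift(hiddenInformation[key], control[key])}%`
  );
}
// Social proof comes out at roughly +39% to +44%; Hidden information at +77% or more.
```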

So Dark Patterns are indeed very powerful, which is also why they should be monitored and eventually regulated.

But Social Proof is also very powerful: in Luguri & Strahilevitz’s A/B tests, it increased acceptance rates by up to 43%, which suggests it can give a significant lift to your conversion rates too.

How can you apply Social Proof to your website?

In these articles, you’ll find ways to use Social Proof on your website:

Then you can also use some Social Proof Apps such as Nudgify.

Nudgify turns your data into Social Proof. It’s the easiest way to add Social Proof to any page of your website.

The whole concept of Nudgify is based on Nudge Marketing.

You can show recent activity, such as page views and recent sales, on any page. You can also showcase your best testimonials from Trustpilot, Google Reviews or Capterra. In addition, Nudgify lets you show when stock is running low, when a deal is ending, and how much a customer needs to spend to qualify for free delivery.
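To make the idea of “turning your data into Social Proof” concrete, here is a minimal sketch of the general principle. It is not Nudgify’s actual API; the types and function names are purely illustrative. The key point is that a notification is only ever generated from a real, recent event.

```typescript
// Illustrative only (not Nudgify's API): a nudge is built from real events
// (orders, sign-ups, page views) pulled from your own database or analytics.
interface RecentEvent {
  type: "purchase" | "signup" | "pageview";
  product?: string;
  occurredAt: Date;
}

// Turn a genuine event into a social-proof message, or return null if the
// event is too old to be honestly presented as "recent" activity.
function toNudge(event: RecentEvent, maxAgeHours = 24): string | null {
  const ageHours = (Date.now() - event.occurredAt.getTime()) / 3_600_000;
  if (ageHours > maxAgeHours) return null; // never fake recency

  const when = ageHours < 1 ? "in the last hour" : `${Math.floor(ageHours)} hours ago`;
  const messages: Record<RecentEvent["type"], string> = {
    purchase: `Someone bought ${event.product ?? "this product"} ${when}`,
    signup: `A new customer signed up ${when}`,
    pageview: `This page was viewed ${when}`,
  };
  return messages[event.type];
}
```

A rule like maxAgeHours is what separates honest Social Proof from a Dark Pattern: if there is no genuine recent activity, no nudge is shown at all.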

Nudgify aims to help any business re-create that real-life experience online.

Final Words: How Not To Use Social Proof

In Contagious, Jonah Berger explains that things that are public and visible to others are more likely to spread. For example, people are more likely to go into restaurants that are full than into those that are empty. That’s Social Influence, or Social Proof.

However, Social Influence works both ways and can also backfire. While Nancy Reagan’s “Just Say No” anti-drug campaign aimed to reduce drug use among American children, it actually increased it, because it made drug use more public.

Negative Social Proof

You need to be careful when using negative examples: they can unintentionally encourage negative behavior.

The theory behind negative social proof is that, if we see that a lot of people did something “bad,” we can be encouraged to do the opposite. 

In practice, negative social proof makes “bad things” visible and encourages people who had not even thought about them before to do the same! Negative social proof can end up making “bad things” seem normal.

Bad Reviews

Companies and marketers obviously don’t like to see bad reviews. They worry about the consequences for their business, which is legitimate, since 67% of consumers are influenced by online reviews.

But did you know that bad reviews also help to build trust?

According to Reevoo, “68% of people trust reviews more when they see both good and bad scores. They don’t look at reviews in isolation; they definitely notice – and become suspicious – if there are no bad reviews.”

That doesn’t mean you should show only your bad reviews. Just feature positive reviews more prominently within your content.

Social Counters

The purpose of social sharing buttons is not only to help readers to share your content – they also show how many people have already shared that content. 

The use of sharing buttons with counters is usually good practice, but they can backfire: if someone sees that the content or product they wanted to share has 0 shares, they will be less likely to share it.

Avoid using share counters until your site already gets plenty of shares and customers regularly share your content.

You can also replace the counter with a message when there are 0 shares, so that you turn the situation into something positive and encourage users to act: “Be the first to share this content”.
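Here is a minimal sketch of that rule, assuming the share count is already available from whatever social API or analytics tool you use (the function name and the threshold are illustrative):

```typescript
// Decide what the share widget should display. A zero or very low count is
// negative social proof, so it is replaced with a positive call to action.
function shareCounterLabel(shareCount: number, minimumToShow = 10): string {
  if (shareCount === 0) {
    return "Be the first to share this content";
  }
  if (shareCount < minimumToShow) {
    return "Share this content"; // hide a weak number, keep the button
  }
  return `${shareCount.toLocaleString()} shares`;
}

console.log(shareCounterLabel(0));    // "Be the first to share this content"
console.log(shareCounterLabel(4));    // "Share this content"
console.log(shareCounterLabel(1280)); // "1,280 shares" (formatting depends on locale)
```

The exact threshold is a matter of judgment; the principle is simply never to display a counter that works against you.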

by Philippe Aimé

Philippe is the CEO of Convertize. He created his first website in 1998 and has spent the past 20 years finding ways to make digital marketing more persuasive. He now heads a team of CRO consultants.
