The Psychology of Dark Patterns: What Every Internet User Should Know
While browsing online, you've probably caught yourself making decisions that had nothing to do with what you originally set out to do. That feeling isn't just in your head, and it's not by chance either. That's exactly what we're diving into today.
Understand
Oct 16, 2025
8 min



What’s Going on Here?
B.J. Fogg, a behavioral scientist, is the founder of the Stanford Behavior Design Lab.
There, he pioneered "captology" (from Computers As Persuasive Technologies), the study of how digital technologies influence our thoughts and behaviors. Many of his former students now hold executive roles at tech giants like Meta and Google. One of them is even Mike Krieger, co-founder of Instagram. Surprised?
One of the methods they study is persuasive design—a technique used to guide user choices and behaviors through digital interfaces or products. At first glance, it doesn’t seem inherently evil. But the ethical line starts to blur as soon as it begins to undermine our ability to make free choices.
The term "dark pattern" was coined in 2010 by UX researcher Harry Brignull, who defined them as follows:
“Tricks used in websites and apps that make you do things that you didn’t intend to.”

These are intentional strategies designed to get us to make choices that aren’t in our best interest—but that benefit the company behind them. Here are 5 common examples of dark patterns:
Bait and switch: you expect one result when clicking a button or link, but you’re taken somewhere else entirely.
Forced continuity: your free trial quietly turns into a paid subscription—without clear notice.
Hidden costs: extra charges pop up during checkout, only revealed at the last moment—after you’ve already committed.
Privacy Zuckering: nudging users into sharing more personal data than they normally would (named after Mark Zuckerberg and Meta’s practices).
Roach Motel: it’s easy to sign up, but frustratingly difficult to cancel or unsubscribe.
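To make the "hidden costs" pattern concrete, here is a minimal Python sketch of drip pricing. Everything in it is invented for illustration (the price, the fee names, the amounts); it's not taken from any real checkout:

```python
# Toy illustration of the "hidden costs" (drip pricing) pattern:
# the advertised price omits fees that only surface on the final checkout screen.

ADVERTISED_PRICE = 49.99  # what the product page shows (hypothetical)

# Fees revealed only at the last step, after the user has committed (hypothetical)
LATE_FEES = {
    "service fee": 7.50,
    "processing fee": 2.99,
    "mandatory insurance": 4.25,
}

def final_total(advertised: float, fees: dict[str, float]) -> float:
    """Total the user actually pays once every late-revealed fee is added."""
    return round(advertised + sum(fees.values()), 2)

total = final_total(ADVERTISED_PRICE, LATE_FEES)
markup = round((total / ADVERTISED_PRICE - 1) * 100, 1)
print(f"Advertised: ${ADVERTISED_PRICE} -> charged: ${total} (+{markup}%)")
# Advertised: $49.99 -> charged: $64.73 (+29.5%)
```

The pattern works because the sunk effort of filling in a checkout form makes a roughly 30% surcharge feel cheaper to accept than to abandon.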
It's hard to list them all here without losing your precious attention, but you can browse the full taxonomy on Brignull's Deceptive Patterns site.
Shady manipulation techniques have always existed. But they’ve reached a new level in the digital age. Their increasing subtlety and omnipresence make them a serious issue in our daily relationship with tech.
Dark patterns are, in fact, the secret sauce of all addictive technologies. Take infinite scroll, for instance: a feature designed to capture our attention far longer than the old-school page-by-page setup.

The sneakiness, as we’ll see, lies in how these intentional designs exploit the flaws in human psychology.
Vulnerable Beings
Sean Parker, Facebook's founding president (ousted in 2005 after a cocaine-possession arrest), said this about the "like" button introduced in 2009:
“It’s a social-validation feedback loop… exactly the kind of thing a hacker like me would come up with, because it exploits a vulnerability in human psychology.”
The like button was a direct hit on our need for social validation, nudging us to interact more—and come back more often. This example shows why some dark patterns are so effective: our brains haven’t evolved much since the Paleolithic era. But our understanding of how they work has grown—especially when it comes to cognitive biases, those automatic and unconscious decision-making shortcuts.
These biases help the brain save time and energy by creating “mental shortcuts.” As a result, our online behavior is—to some extent—predictable. These biases, though handy in certain contexts, drastically simplify our decision-making process. And dark patterns take full advantage of these psychological vulnerabilities.
Here's a classic example: in 2010, as an April Fool's prank, the British game retailer GameStation snuck an outrageous clause into its terms and conditions, just to prove how few people actually read them. By agreeing to the terms, you were giving the company the right to own your soul. No joke.

And I’ll admit it—I’d have given it to them. Just like the 7,500 others who agreed without hesitation before the prank was revealed.
While researching, I came across a site called FairPatterns—worth checking out if you want to go deeper into this topic. According to them, there are over 180 cognitive biases. That makes it a wildly uneven playing field: experts in neuroscience and behavioral science are working full-time to test and refine the best tactics for influencing us—by tapping into our weaknesses. Sometimes it's not even subtle. It sounds like a comedy sketch, but Dopamine Labs pitched this to prospective clients:
“Connect your app to our persuasion AI, and increase engagement and revenue by 30%, by giving your users our perfect dopamine hits.”

Shady marketing tricks are nothing new. But the digital era changed the game: algorithms now adapt in real time to our behavior and reactions. As neuroscience advances, manipulation techniques are getting sharper, more complex, and harder to detect. Still, I found myself wondering about boundaries.
From the company’s side: where do you draw the line between a clever business tactic and a dark pattern?
From the user’s side: what’s the line between inattention and actual manipulation?

Design: Jedi or Sith?
It’s a big grey area, and not everyone’s on equal footing. And to be fair, not all behavioral nudging is bad. Take “nudges,” for example—subtle design tweaks that promote positive behaviors.
In the 90s, Amsterdam’s Schiphol Airport managed to cut men’s bathroom cleaning costs by 70% just by placing a fly sticker in the urinals. A simple challenge to help users aim better.

It hit the mark.

Hooking Attention
Some dark patterns are specifically designed to capture your attention and make a product sticky and addictive. Nir Eyal, a former student of B.J. Fogg, published Hooked in 2013. In it, he outlines a four-step cycle that products use to build user habits: trigger, action, variable reward, investment.

Trigger: the thing that sparks usage. It can be external (like a notification) or internal (a feeling or thought).
Action: the behavior performed in response, in anticipation of a reward.
Variable reward: the benefit from that action, delivered in an unpredictable way—just like a slot machine. That unpredictability is what reinforces the behavior.
Investment: the user puts something in—time, data, effort, social engagement, or money—which increases the chance they’ll return, thanks to the “endowment effect.”
Over time, the brain runs this loop on autopilot. That’s when we lose control. Let’s use Instagram as an example of this cycle.

Trigger: I get a notification (external) / I feel bored or need social validation (internal).
Action: I open the app, watch some stories, maybe post something.
Variable reward: I don’t know what’ll show up in my feed, or who liked my story. The uncertainty is key—I never know what dopamine hit I’m going to get.
Investment: I invest time and energy posting, scrolling, interacting.
Then the loop starts again—with more and more compulsion.
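The cycle above can be sketched as a toy simulation. Everything here is illustrative: the trigger probability, the reward odds, and the "habit strength" score are made-up numbers meant to show the mechanics of the loop, not a model of any real app:

```python
import random

def variable_reward() -> int:
    """Slot-machine-style payoff: usually nothing, occasionally a big hit.
    The unpredictability is what makes the loop sticky."""
    return random.choices([0, 1, 10], weights=[60, 30, 10])[0]

def hook_cycle(habit_strength: float) -> float:
    """One pass through Eyal's trigger -> action -> variable reward -> investment loop."""
    # Trigger: external (a notification) or internal (boredom); a stronger
    # habit means weaker triggers are enough to start a session.
    triggered = random.random() < 0.5 + habit_strength
    if not triggered:
        return habit_strength
    # Action: open the app, anticipating a reward.
    reward = variable_reward()       # Variable reward: unpredictable payoff
    investment = 0.01                # Investment: time/content put in raises return odds
    # Rewarded sessions reinforce the habit slightly more than empty ones.
    return min(1.0, habit_strength + investment + 0.02 * (reward > 0))

strength = 0.0
for _ in range(100):                 # each iteration = one potential session
    strength = hook_cycle(strength)
print(f"habit strength after 100 cycles: {strength:.2f}")
```

Run it a few times: the score ratchets upward because investment compounds, which is exactly the "loop on autopilot" the cycle is designed to produce.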
A Future of Refinement And Regulation
The most blatant dark patterns that were once common are becoming rare. But don’t be fooled—they’re not going away. They’re baked into the attention economy. The rapid rise of AI will only amplify captology: more personalization, more nuance.
The battle for attention keeps escalating—and more sophisticated tactics are bound to follow. You can count on their creativity. Still, some institutions are starting to take this seriously.
Epic Games, the company behind Fortnite, agreed to pay $520 million to settle Federal Trade Commission (FTC) charges, including $245 million in refunds over dark patterns that nudged players into unwanted in-game purchases.

In June 2023, the FTC also sued Amazon for "tricking millions of customers" into renewing Amazon Prime subscriptions without clear consent. The message is clear: regulators are losing patience with manipulative tricks.
In Europe, the Digital Services Act (DSA) came into force for the largest platforms in August 2023, and for all digital platforms in February 2024. One of its key measures? A ban on dark patterns—backed by steep fines. But until stronger collective action is in place, we need to do the work individually.
Stay Vigilant
Jean-Claude Van Damme had it right: “The key is to be aware.” That’s the goal of Jomo—to raise awareness. This edition is all about recognizing these dark arts that often work against your best interest.
In his book Thinking, Fast and Slow, Daniel Kahneman describes two systems in the brain:
System 1: fast, instinctive, and emotional
System 2: slower, more deliberate, and logical
With System 1, our brain runs on autopilot. We skim. We rely on default behaviors. That’s what dark patterns target—System 1, which is constantly fueled by the attention economy. But by consciously activating System 2, we can better resist their pull.
Next time you use an app or digital product, try being more mindful:
Why are you making the choices you’re making?
What’s influencing your behavior?
Are you making an informed decision?
In short, the key is awareness and mindfulness. To bring back a little slow attention. Think before you click or scroll. Watch out for fine print and asterisks. A real challenge—given how our digital habits push us to move faster and faster.
Act Before it’s Too Late
If you’re looking for help—and a little push—to disconnect, we recommend the Jomo app. It’s available for free on iPhone, iPad, and Mac.
With Jomo, the goal isn’t to ban phone use altogether, but to use it more intentionally and mindfully. To stay in control—and not become a modern slave to that little device in your pocket.
One of our favorite features in Jomo is a rule called “Conscious Use”, and it works incredibly well. The idea is simple: by default, your distracting apps are blocked. To use them, you’ll need to ask Jomo for permission.
You’ll be prompted to explain why you want to use the app—and for how long. It’s a powerful way to create distance without frustration, and to regulate your screen time without going cold turkey.
Here’s how to get started:
Go to the “Rules” tab
Scroll down to the “Templates” section and tap “Conscious Use”
Add the apps you want to block, and you’re all set!


Credits
This article is a revised version of Edition #27 of the Screenbreak newsletter created by Julien Rousset. With his permission, we're sharing this high-quality content with you today! So many thanks to Julien. 😌
Photographs by Unsplash, DALL-E, ScreenBreak, and the Internet.
[1] FairPatterns.
[2] "Design de l'attention : quand le design influence nos comportements" ("Attention design: when design shapes our behavior"), Le Bon Digital, 2020.
[3] Fussell, "The Endless, Invisible Persuasion Tactics of the Internet," The Atlantic, 2019.
[4] Pineau, "Digital Services Act et interdiction des dark patterns : un nouveau paradigme pour le design des interfaces numériques" ("The Digital Services Act and the ban on dark patterns: a new paradigm for digital interface design"), Alternatives Economiques, 2023.
[5] Kendrick, "What Makes a Dark UI Pattern?," NN Group.
[6] Vaz, "Dark design patterns — the day of reckoning is coming," Medium, 2023.
[7] "Types of deceptive pattern," Deceptive Patterns.
[8] "Persuasive Design and the Attention Economy," bblewrap, 2023.
The Joy Of Missing Out

Crafted in Europe
All rights reserved to Jomo SAS, 2025