We Design for Delight. We Forgot to Design for Safety
The Numbers Don't Lie - But They Do Mislead
Here's a statistic that gets thrown around a lot in cybersecurity circles: 80-90% of all data breaches involve human error.
Think about that for a moment. Not sophisticated state-sponsored hacking. Not zero-day exploits. Not AI-powered attacks. The overwhelming majority of breaches happen because a real person, just like you or me, clicked the wrong link, reused a password, or trusted the wrong email.
We've all seen it. A colleague clicks a phishing email that looks exactly like a Microsoft login page. Someone uses the same password across their bank, their email, and their Netflix account, and one breach unlocks everything. A person connects to a coffee shop's free WiFi and logs into their bank account without a second thought.
These aren't stupid people. They're normal people navigating systems that were never designed to protect them.
And that's exactly the problem.
We Design for Delight. We Forgot to Design for Safety.
I've spent over a decade in product design. I've obsessed over button states, onboarding flows, empty states, and micro-interactions, shaving friction from every step of a user journey. We talk endlessly about delight, engagement, and conversion.
But there's one area where our industry has consistently looked away.
Security.
And that's not an accident. It's a design choice.
We Blame the User. We Shouldn't.
Every time someone falls for a phishing email, uses a weak password, or accidentally grants an app full access to their contacts, we call it user error.
I call it a design failure.
Because using digital systems securely requires a level of prior technical knowledge that most people simply don't have and shouldn't need to have. We're asking everyday users - people of all ages, backgrounds, and literacy levels - to understand 2FA, spot sophisticated phishing attempts, manage permissions across dozens of apps, and make informed decisions about their data.
That is an extraordinary and deeply unfair expectation.
When a user fails at security, the right question isn't "why didn't they know better?" It's "why did we build something that required them to?"
We Design Everything Else for Humans
Think about how much care goes into a good onboarding experience. Every word is tested. Every step is considered. We remove every unnecessary friction point because we know that confusion kills conversion.
Now open your phone's privacy settings.
Suddenly complexity is acceptable. Jargon is fine. Walls of text are normal. Options are buried four levels deep in menus nobody opens. Security features that should be on by default are opt-in - hidden behind toggles that most users will never find.
We apply world-class design thinking to make people spend more time on an app. We apply almost none of it to keeping those same people safe.
That double standard should bother every designer reading this.
The Dark Side of Our Own Superpower
Here's something designers don't say enough out loud: we are extraordinarily good at shaping human behaviour. Through visual hierarchy, choice architecture, defaults, and friction, we guide what people do, often without them realising it.
That power has been used brilliantly to drive engagement, retention, and growth.
It has also been used to manufacture consent through deliberately unreadable privacy policies, to bury opt-out options, and to make the "agree to everything" button always bigger, brighter, and easier to tap than the alternative.
That's dark UX. And many of us have built it, sometimes knowingly, sometimes under pressure, sometimes just following the brief without questioning it.
If we can use design to manipulate users into giving away their data, we can use the same design thinking to genuinely protect them. The capability exists. The question is whether the intention does.
What Good Security UX Actually Looks Like
It isn't more warnings. More popups. More red banners screaming at users.
Good security UX is simple, proactive, and as invisible as possible. It looks like defaults set to the most secure option, not the most convenient one for the business. Plain language that explains what a permission actually means in human terms. Proactive alerts that feel helpful, not alarming. Security flows designed with the same care as a checkout experience. Onboarding that builds safe habits from day one, not a settings page nobody visits.
Some of this exists in pockets. Apple's App Tracking Transparency prompt was a rare example of a major platform giving users a genuinely clear, simple choice. The industry reacted like it was a disaster. Users loved it.
That tells you everything about where the resistance to good security UX actually comes from.
The Designer's Responsibility
We are not just pixel pushers executing briefs. Every design decision we make shapes how millions of people experience the digital world.
When we design a consent flow that nobody reads - we made that choice. When we bury privacy controls - we made that choice. When we prioritize engagement metrics over user safety - we made that choice.
And we can make different ones.
Security doesn't have to be complex, intimidating, or ugly. It can be clear. It can be human. It can even be delightful. But only if designers decide it's worth their best thinking, not just their compliance checkbox.
A Direct Challenge
To every designer, product lead, and founder reading this:
When did you last audit your product's security experience the way you audit your onboarding? When did you last run a usability test on your privacy settings? When did you last ask whether your consent flows are genuinely informed or just legally covered?
Safe should be the default. Simple should be non-negotiable. And no user should need a computer science degree to protect themselves on a platform you built.
We have the tools. We have the craft. Now we need the conviction.
Up to 90% of breaches involve human error. But humans didn't fail security - security failed humans. It's time designers fixed that.