While our culture of internet addiction and social gratification continues to worsen our mental health, how can designers adopt a more ethical approach to UX design that puts our users' best interests at heart?
Many claim that our smartphone obsession and internet culture have created an unethical, unhealthy online environment; others argue that mobile experiences only benefit and improve our lives.
One thing everyone agrees on is that digital experiences are completely unavoidable in our modern world, and whether we're talking about mobile apps, websites or other platforms, design can be used for good and for bad.
In this complex society, where technology only creates more room for ethical dilemmas, we as designers have to find practical ways to spot ethical problems in the first place and then tackle them.
1. UX design & its impact on our mental health
2. User testing with dissimilar communities & cultures
3. Inclusive design & the importance of accessibility for all
4. Death in design: dealing with sensitive subjects
With the age of internet addiction and instant gratification only continuing to rise, it's crucial that we not only understand, but also try to combat, the negative impacts that online experiences and UX design can have on our users' mental health.
Companies are using the same techniques used in gambling to facilitate releases of dopamine, promoting unhealthy addictions through a false happiness, be it through apps, websites, Alexa Skills or any other platform that provides a digital experience for users.
For me, one of the most ethically interesting areas in design is social media, where it is common to see features that users not only want but expect, such as near-endless feeds, even though these features have huge negative impacts on our mental health and well-being.
From false notifications to hidden items in your basket and even purposefully misplaced unsubscribe buttons: there are a lot of dark patterns out there, and every single one of them has a negative impact on mental health.
These patterns are perhaps most prevalent in the likes of social media and e-commerce platforms, but can be found in most of today's digital experiences, where screen time and monetisation models are all set up to keep us scrolling, tapping and swiping our lives away, with no interest in the impact this has on us long term.
Dark patterns aren't just present in visible, component-based features within an interface; they also occur algorithmically. The algorithms that social networks, e-commerce sites and most other digital platforms use to serve content to users are extremely clever, but equally ethically challenging.
As you like, favourite, subscribe or even purchase on these platforms, you are informing these algorithms of your interests. They can then serve you native adverts that spookily show up across all your websites and apps. Sound familiar?
These algorithms in turn form echo chambers. The idea of an echo chamber is that as you subscribe to things you are interested in, the algorithms slowly serve you more and more content on that topic.
Internet users can seek out interactions with like-minded individuals who have similar values, and thus become less likely to trust important decisions to people whose values differ from their own
The problem is that you end up being served the most extreme views on that topic, as the media and products served to you come from the powerhouses in charge of that sphere of media. This dictates a circle of influence that we end up following.
This isn't helped by the fact that the entire model of the 'like' is designed to induce dopamine hits. This massively promotes media toxicity, as people's need for approval from those they aren't close to, or have never even met, increases; and that's without mentioning that it opens a playground for cyberbullying and internet trolls.
The worst part? Jakob's Law states that users spend most of their time on other platforms, so when a new platform arises we want it to be consistent with what we are used to, meaning we welcome these addictive dark patterns into new platforms and experiences.
That being said, there are ways we can use such patterns for good, for example promoting healthy eating in what I call positive echo chambers.
Our first responsibility as designers in avoiding dark patterns and unethical designs is to do our research and make sure we are aware of them, using resources such as Dark Patterns.
There are a lot of design resources out there. One of the best I've come across is IDEO's Human-Centred Design Toolkit, but I've always come back to the more bitesize Laws of UX: a set of guiding principles that outline ways design can be used to influence our users.
We can follow the guidelines of ethics communities such as AIGA, and study cognitive behavioural science to better understand the psychological implications of our designs. The Hook Model and Manipulation Matrix, found in Nir Eyal's book Hooked, are examples of frameworks that give designers the power to build habit-forming products.
We can perform ethical passes by asking ourselves questions about our designs, such as the ones outlined by the UX Collective on Dark Patterns:
User testing and user interviews are often the most important stage in digging out usability and ethical issues in a design. The problem is that dark patterns often go unnoticed by users in these sessions, because users don't realise they are there. These patterns have become accepted and, worse, expected, because they are what users are used to.
I recently got to work on an inspiring project around mental health and mobile apps, where balancing business goals with the user’s best interests was an extremely difficult and ethically challenging thing to get my head around.
One such question I spent hours debating with myself was the use of gamification: creating an experience that I knew would have a hugely positive impact on users, even if that experience itself used tactics often associated with dark patterns.
Another was how we could scale the platform to reach as many users as possible and provide the best experience we could, while finding the resource and budget that would make that possible in the first place.
Most of these debates were internal and, in actual fact, the choices we made with the client always had the users' best interests at heart. But it did get me thinking about how other brands would handle the situation, and I'm sure in some cases the outcome would have been the polar opposite.
For me, tackling dark patterns means influencing designers and providing people with enough knowledge and skill to offer up ethical alternatives that can achieve the same outcomes.
As designers, we're faced with an abundance of hugely interesting challenges, especially in the tech-for-good sector, where usability and user experience are absolutely paramount in creating digital products that can truly make a difference in the world.
But how do you design and user test an app for someone who has never used a phone before? Someone who doesn’t speak the same language as you, lives on the other side of the world or comes from a completely different culture you know very little about?
This is a challenge all too familiar to our team at CUBE, and it’s something that we’re always trying to learn more about and grow our knowledge and experience in.
For those using our apps who have very low tech proficiency or literacy levels, we have to ensure that as we develop new features we don't massively change the app's interface or information architecture; otherwise the experience will become completely unintuitive.
Not to mention that these users also require training, so training materials have to be changed every time there is a major design change, which can come at great expense and inconvenience to the organisations using our apps. So we do everything we can to avoid this.
On a project where we can't perform a "catastrophic reset" or drastically change a user interface or experience, something else we have to avoid is design debt.
Design debt is how we describe taking on or building upon existing platforms with established designs, where features have been bolted on here and there and older versions of the designs get carried through; something that often occurs as design conventions change over time.
Not only does design debt age the user interface, it also carries inconsistencies that make interfaces much less comprehensible and more difficult to navigate, and that break all sorts of accessibility guidelines.
The cohesion and consistency of a design deteriorates over time as new experiments are run and new elements are introduced into the design.
The world in general subscribes to certain design conventions: we have been taught things a certain way, through shapes and colour theory; we know how to identify buttons and form elements, and can predict their intended function from their colour, shape and size.
Sadly, not all design conventions are best practice; they include ethically questionable approaches such as dark patterns, which are becoming increasingly difficult to disrupt.
However, if we were to flip conventions on their head and stray from what people know, although it might bring a better user experience in the long run, it would take so much getting used to, training and re-introduction of concepts for it to have any uptake. This in itself limits what is possible.
Ethnographic research is a method we can use to put ourselves in the shoes of our users. As cliché as it sounds, it's a way to see the world from their perspective and consider the 'wider space' of how a platform fits in with the rest of their lives.
We can actually experience the problems our users experience and try our solution in the same context. Combine this with contextual observation and we can create user journeys that we know fit in with our users' wider online behaviour.
It provides the researcher with an understanding of how those users see the world and how they interact with everything around them.
I recently got the opportunity to do remote user testing across six different time zones, with people of five different nationalities, all of whom spoke a varying degree of English.
Performing my due diligence on each participant before talking with them allowed me to feel comfortable and confident going into the session that I could treat the participants with the utmost respect and get the best session results possible.
I also recently planned a user testing session, for which James Marriott was test administrator, where participants didn’t speak English and had low tech proficiency and literacy levels.
For these sessions we had to have Spanish translators in the room, which added an extra layer of communication and complication to the testing, but the results we came out with were still incredibly informative.
We always start with the why and build up to the what, making sure that we plan the testing sessions and user interviews to capture as much useful information as possible for as many team members as possible, to avoid questions being asked further down the line.
One of the best ways I've found to do this is hosting a collaborative workshop where everyone working on the project plans the user testing session together. This method is exceptional at making sure we never make assumptions or change the context the user we're testing with is in.
Accessibility has always played a significant role in design, but its recognition and importance from a user experience perspective is only growing, and with it, the question of designing for all.
The problem with designing for all is “compromise”. If you design for everyone you will inevitably have to compromise some features or design aspects to suit the needs of different users.
This can end in a design which is not functional for anyone. That's why, rather than designing for all, we tend to design for our core audience but ensure we are inclusive of our entire audience, in a non-discriminatory manner, by considering accessibility guidelines and standards, digital anthropology and percentiles.
In our design process, we design to fit the Web Content Accessibility Guidelines (WCAG), or the Americans with Disabilities Act Standards (ADA), as well as running our own in-house accessibility passes before build releases.
We also follow ISO 9241-210:2010, Ergonomics of human-system interaction, for our general usability standards.
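As an illustration of the kind of check an in-house accessibility pass can automate, the snippet below implements WCAG 2.1's relative-luminance and contrast-ratio formulas. AA conformance requires a ratio of at least 4.5:1 for normal text (3:1 for large text). This is a minimal sketch of one check, not our full pass.

```python
def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG 2.1 relative luminance of an 8-bit sRGB colour."""
    def linearise(channel: int) -> float:
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b) -> float:
    """WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter colour on top."""
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

black, white = (0, 0, 0), (255, 255, 255)
print(contrast_ratio(black, white))          # ~21, the maximum possible ratio
print(contrast_ratio(black, white) >= 4.5)   # passes AA for normal text
```

Run over every text/background pair pulled from a design's colour tokens, a check like this catches contrast failures before they ever reach a build release.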
We have recently completed an update to a number of our apps to make them ADA compliant. This meant diving into each app and analysing the code and design to ensure:
Each OS provides native support for accessibility features, and by immersing ourselves in the guidance given in each platform's interface guidelines, we gained more insight to inform design decisions that allow these accessibility features to work seamlessly.
We’re working to include contextual observations as well as interviewing and testing with users who have certain disabilities, so we can gain quantitative and qualitative feedback on our apps and constantly improve our inclusive design process.
One of the greatest parts of complying with the ADA standards is knowing that the app we've produced is now accessible to a wider audience, getting into more hands and reducing the frustration points users must have been experiencing previously.
It is inevitable that from time to time you will face some tricky ethical issues when it comes to design. How do you go about designing an app that handles life, love, death, pain and a whole plethora of sensitive topics?
We will focus on handling the user experience of the admittedly bleak topic of death, as this is something we have had recent experience with on a design project.
Being such a sensitive subject, how do you gather user research insights in such a difficult area? Harder still, how do you test your product or service to evaluate its effectiveness in dealing with such an issue?
We can break this up into four main sections: User Research, Information Architecture, Ethical Passes and User Testing. We will have a quick look at each of these and then show an example of how we have used some of these methods ourselves.
The best way to start a project is to make sure you know who your users are and what problem you are solving for them. Each user is unique and has different needs, however, we can draw some similarities between certain groups of users.
Doing contextual observation and seeing your users in the context of your problem is an invaluable way to gain insight and learn how certain problems may affect them differently. We can then extract this information and use it to inform our user journeys.
In the context of sensitive subjects, sometimes this can be a difficult problem to approach. You have to choose whether to interview users or to observe them from afar.
For example, when researching an app that handles anonymous mortality rates, users will likely be more than happy to talk about their experiences in an interview. When researching an app for those grieving the loss of a relative, however, talking to users directly is much more difficult and sensitive, so we'd choose to approach it differently.
Empathy is at the heart of design. Without the understanding of what others see, feel, and experience, design is a pointless task.
The next important step is to iron out your information architecture, to help you understand where everything should fit within your app. It’s at this stage that we should consider our UX vision, and include a backlog of features that we intend to release over the next few years, so that we avoid future design debt. Card sorting activities are a great way to do this.
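One lightweight way to act on card-sort results is to count how often participants group two cards together; pairs with a high co-occurrence are candidates for the same section of the information architecture. A minimal sketch, with made-up cards and results:

```python
from collections import Counter
from itertools import combinations

# Hypothetical open card sort: each participant groups cards however they like.
results = [
    {"Account": {"login", "profile"}, "Help": {"faq", "contact"}},
    {"Me": {"login", "profile", "contact"}, "Support": {"faq"}},
    {"Settings": {"login", "profile"}, "Support": {"faq", "contact"}},
]

# Count how many participants placed each pair of cards in the same group.
pair_counts = Counter()
for participant in results:
    for group in participant.values():
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

# Pairs sorted together most often are candidates for the same IA section.
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n}/{len(results)} participants")
```

With real data you would feed this co-occurrence matrix into clustering or a dendrogram tool, but even the raw counts quickly surface which cards users expect to live together.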
It’s crucial to consider the space between our product and our user’s journey, making sure we map out all the touchpoints we have and considering their implications outside the user’s direct interactions with our product.
This is the only way we can accurately predict issues further down the line that relate to our users' experience of sensitive topics and messages. It also ensures that we know which ethical topics we are handling even before considering wireframe flows and what the information architecture will look like visually.
Ethical passes are a series of questions or activities we can ask ourselves or perform while we are designing to ensure that we avoid including unethical patterns in our designs.
One example is a 'provocatypes' workshop, as detailed by Cennydd Bowles in his book Future Ethics: considering all the worst-case scenarios that our product could be part of, and prototyping them out so we can see the extent of their effects while finding ways to solve them.
User testing is a particularly important part of validating whether the assumptions we have made while designing are correct. Although all the user research processes above aim to reduce these assumptions, it is inevitable that we will still make some. We have to validate our designs and design changes with users, as they may approach things differently than expected.
While dealing with one recent project around sensitive messages and communications during times of emotional crisis, we spoke to a sample of users about some of the ideas we had to deal with the topics and issues.
The most notable feedback proved invaluable to the project as it allowed us to recognise a few different groups within the audience who all had different ways they would want us to handle the experience.
Some were grateful for messages, while others preferred no contact at all. One thing that also stood out was the use of wording and linguistics: we found that what we thought was a 'sensitive way' of communicating was interpreted completely differently by each group of users.
This provided insights into how we could tailor and customise the platform to communicate in a way which suited each individual user's preferences, rather than making the decision for them.
One of the biggest challenges I faced was making sure what I proposed would cater and be sensitive to everyone. People think and believe so differently that it was a tough one to play out: making sure the terminology and copy were correct, and then working out how to present the idea of death to the masses.
Designing UX to improve our mental health:
Focus on balancing your business goals with the best interests of your users, avoid dark patterns and find ethical alternatives that can achieve the same outcome.
Testing with users from different backgrounds:
Do your due diligence, research and learn about the lives of the people you’re testing with, without making assumptions or changing their environment too drastically.
Creating a platform that’s accessible by all:
Follow the guidelines and test with technology that’s used by your users, ensure that while being mindful of all, you are most importantly designing to be inclusive of everyone in your audience.
Dealing with sensitive topics and communications:
Plan ahead and spend time thinking about all the possible ways sensitive topics could be interpreted, use compassionate language and never leverage any of this for your own gain.
I am constantly looking to improve our processes and find other ways to handle these ethical challenges but, as I said at the beginning of this post, ethics can't be defined. In the end, all we can really do is follow a set of principles and ask ourselves what's in the best interests of our users at every stage of the design process.
Get in touch if you want to discuss ethics, design, UX, mental health or any of the projects I’ve mentioned in this post, and keep an eye out for the blog series to follow!
Published on May 6, 2021, last updated on March 16, 2023