Have you ever wanted to set up a closed-circuit security camera on your website? That’s what analytics tools like Usabilla, Mouseflow, and Hotjar let you do. You inject a snippet of code into a website, and it captures every user interaction imaginable: page navigation, mouse movements, clicks, taps, and keystrokes. But is it okay to collect so much personal user data?
Once some user sessions have been recorded in one of these analytics tools, product and design teams can log in and instantly observe user behaviour in a series of bite-sized video clips. Insights from these sessions can then be synthesised, catalogued, and used to inform design decisions, uncover usability issues, or squash bugs in the website. Often referred to as session replay, this technique has grown in popularity as the technology has improved, and it’s now considered a staple activity in user research.
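Under the hood, these snippets work by listening for browser events and buffering timestamped records for upload. Here is a minimal sketch of the idea in plain JavaScript; the `SessionRecorder` class and its field names are my own invention for illustration, not any vendor’s actual API, and real tools capture far more (DOM mutations, scroll positions, masked input, and so on).

```javascript
// Hypothetical sketch of a session-replay recorder, not a real vendor API.
class SessionRecorder {
  constructor(now = Date.now) {
    this.now = now;   // injectable clock, handy for testing
    this.buffer = []; // timestamped interaction records awaiting upload
  }

  // Store one interaction as a compact, timestamped record.
  record(type, detail = {}) {
    this.buffer.push({ type, t: this.now(), ...detail });
  }

  // Drain the buffer into a JSON payload, as a tool might before uploading.
  flush() {
    const payload = JSON.stringify(this.buffer);
    this.buffer = [];
    return payload;
  }
}

// In a browser, the recorder would be wired up to DOM events, e.g.:
// const rec = new SessionRecorder();
// document.addEventListener('click', e => rec.record('click', { x: e.clientX, y: e.clientY }));
// document.addEventListener('keydown', () => rec.record('keydown'));
// window.addEventListener('beforeunload', () => navigator.sendBeacon('/collect', rec.flush()));
```

The striking part is how little code this takes: a handful of event listeners is enough to reconstruct most of what a user did on a page.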
Until a few years ago, a researcher could only gather detailed insights on a design by simulating the user experience with one-on-one usability lab studies. These types of studies are still considered industry-standard, especially when testing design concepts still in development, but they run into some clear limitations.
The biggest issue with usability studies is that they involve recruiting participants who may or may not actually use the digital service and subjecting them to strict, lab-style conditions where they must complete a list of tasks with little or no guidance. These sessions can feel unwelcoming and clinical to participants as their behaviour is put under a microscope. They can also set users up to fail, often intentionally for the sake of the test, and easily fall prey to the Hawthorne effect: people change their behaviour when they know they are being watched or assessed, because they inherently want to be seen as competent by their peers. It’s difficult to remove biases like this from a study when they are seemingly baked into the format.
There is value in running almost any research activity, but recently I’ve favoured remote, unmoderated usability studies for testing undeveloped design concepts. These sessions reduce some of the biases and can be deployed quickly, although they can still be compromised if the participants you’ve recruited don’t meet the criteria of real users. And even when a remote participant completes an unmoderated design test, they understand they’re being assessed and may still alter their behaviour to perform well.
Session replay is arguably less biased than usability studies because it gives researchers intimate access to how real users behave in their natural environment. Rather than being scrutinised by a researcher sitting next to them in a lab, users interact with an app in the comfort of their home or office. It’s likely the user is completely unaware they are being monitored, unlike in usability studies, which require clear consent from the participant.
The feature that makes session replay a powerful, unbiased research tool is also what raises ethical concerns. Let’s use the analogy of a brick-and-mortar retail store. When product teams install session replay tools on a website, they are setting up a sophisticated store surveillance camera that never turns off. In fact, the surveillance camera on the website is much more powerful than a camera in a store. This camera captures more than every movement of the customer: it captures which city they’re visiting from, what browser they’re using, how many pages they visit, how long they look at each page, which site referred them, and more. Some tools even track ‘rage clicks’, or how aggressively a customer clicks on elements of a page.
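Rage clicks are usually detected with a simple heuristic: several clicks on the same element within a short time window. A rough sketch of that idea follows; the function name and the thresholds are illustrative, not taken from any specific tool.

```javascript
// Hypothetical rage-click detector: flags any click that starts a burst of
// minClicks or more clicks on the same target within windowMs milliseconds.
// Thresholds are illustrative, not taken from a real product.
function detectRageClicks(clicks, { minClicks = 4, windowMs = 1000 } = {}) {
  // clicks: array of { target, t } sorted by timestamp t (in ms)
  const flagged = [];
  for (let i = 0; i < clicks.length; i++) {
    let count = 1;
    for (let j = i + 1; j < clicks.length; j++) {
      if (clicks[j].target === clicks[i].target &&
          clicks[j].t - clicks[i].t <= windowMs) {
        count++;
      }
    }
    if (count >= minClicks) {
      flagged.push({ target: clicks[i].target, start: clicks[i].t, count });
    }
  }
  return flagged;
}
```

Four frantic clicks on a broken ‘Buy’ button inside a second would be flagged; a single stray click elsewhere would not. That one signal can reveal a broken element faster than any lab study.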
Unsavvy users have no idea how much of their data is being tracked, which is the biggest cause for concern. A brick-and-mortar store could never collect that much information on its customers.
As designers, there are a few tough questions we must ask ourselves. Are we infringing on customers’ online liberties and privacy? Are we willingly contributing to the over-surveillance of the internet? Are we collecting too much personal data for the sake of optimising a design? I know I still grapple with these questions. I don’t know the answer, but my gut says we’re overstepping.
You could argue that when you interact with a website, the relationship goes both ways. The act of going online is in itself a transaction between a user and service providers. You share your personal details and, in exchange, instantly access an endless world of products and services personalised to your needs. The relationship is mutually beneficial, but it comes at a cost.
In response to data concerns, some session replay tools have started giving users the ability to view and delete their data on request. Laws are also being created to regulate the personal data companies can capture. In April 2016, the European Union was amongst the first to act on increased privacy concerns by adopting the General Data Protection Regulation (GDPR), a strict set of rules that protect European users and require their consent before personal data is collected. Companies that fail to comply with these rules are subject to eye-watering fines and penalties. In the US, the California Consumer Privacy Act (CCPA), introduced in 2018, includes a section on users’ rights to delete personal data collected on them. In December 2019, the Australian government announced a review of the Privacy Act 1988 to better protect personal user data and combat issues of data breaches.
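In practice, GDPR-style consent means the replay snippet should never load until the user explicitly opts in. A minimal sketch of that gate is below; the function name, the consent object’s shape, and the script URL are all my own assumptions, not taken from any regulation or vendor.

```javascript
// Hypothetical consent gate for a replay snippet. Under GDPR-style rules,
// consent must be explicit and affirmative: no stored choice, or a refusal,
// means the snippet is never injected.
function shouldLoadReplay(consent) {
  return !!(consent && consent.analytics === true);
}

// In a browser, the gate would wrap the vendor snippet, e.g.:
// const consent = JSON.parse(localStorage.getItem('consent'));
// if (shouldLoadReplay(consent)) {
//   const s = document.createElement('script');
//   s.src = 'https://replay.example.com/snippet.js'; // placeholder URL
//   document.head.appendChild(s);
// }
```

The key design choice is that silence counts as refusal: an absent or malformed consent record keeps the camera switched off.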
These laws and regulations are a great step forward, but part of the responsibility sits with the individual user. Unsavvy users need to be made aware that connecting to a web page involves a transfer of data, but that doesn’t mean they have to give up all of their liberties. A user who would rather withhold personal data can take measures to protect themselves online, using VPNs, proxies, and other encryption services. Even browsing incognito in a privacy-focussed browser like Brave helps protect users from basic data sharing. The popularity of encryption and identity-masking tools has skyrocketed in recent years as web tracking software has become ubiquitous.
This isn’t an ethical debate I’m qualified to settle in a short blog post, but it’s a conversation designers need to be having more, as we are at the forefront of this technology. Our role is to be the voice of the user in a business, and that could mean putting rules or limitations around what data we capture in our session replay activities, or at least warning users that we’re storing personal data they may not be aware of.