Wrestling with the ‘e’ word: data ethics and Unroll.me’s data selling woes

Ellen Broad
Apr 27, 2017

News this week that Unroll.me — a popular tool for organising subscription emails — sold their users’ anonymised email data to Uber has sparked a mass exodus of Unroll.me users.

I’m an Unroll.me user, and I’ve been following the public reaction to their data commercialisation model with interest. I’ve also been writing about data ethics recently, and so have been struggling with two questions:

  1. Was Unroll.me’s management of its users’ data unethical? When should we use that language to describe an organisation’s data practices?
  2. Do people understand what is meant by ‘aggregated and anonymised’ data? Do we have the mechanisms to interrogate it?
Kirsty Ren, Unroll.me screenshot (CC BY SA)

I work through the first question in this blog. I’ll tackle the challenges we face in critically interrogating what ‘aggregated and anonymised’ means, in the context of stories like Unroll.me’s use of user data, in a second blog.

People are shocked to find out Unroll.me monetises its customer data

Details of Unroll.me’s data commercialisation model originally surfaced in a NY Times profile of Uber founder Travis Kalanick at the beginning of this week:

Uber devoted teams to so-called competitive intelligence, purchasing data from an analytics service called Slice Intelligence. Using an email digest service it owns named Unroll.me, Slice collected its customers’ emailed Lyft receipts from their inboxes and sold the anonymized data to Uber. Uber used the data as a proxy for the health of Lyft’s business. (Lyft, too, operates a competitive intelligence team.)

Slice confirmed that it sells anonymized data (meaning that customers’ names are not attached) based on ride receipts from Uber and Lyft, but declined to disclose who buys the information.
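The NY Times report doesn’t describe Slice’s pipeline, so purely as an illustrative sketch (every field name and step below is my own assumption, not Slice’s actual practice), ‘anonymized, meaning that customers’ names are not attached’ might in practice mean stripping the direct identifiers from each parsed receipt before aggregation:

```python
# Purely hypothetical sketch of de-identifying a parsed ride receipt.
# Nothing here reflects Slice's actual pipeline; all field names are invented.

def anonymise_receipt(receipt: dict) -> dict:
    """Drop direct identifiers, keep the commercially useful fields."""
    return {
        "service": receipt["service"],    # e.g. "lyft"
        "fare_usd": receipt["fare_usd"],  # ride price
        "city": receipt["city"],          # coarse location
        "month": receipt["date"][:7],     # truncate YYYY-MM-DD to YYYY-MM
    }

receipt = {
    "user_email": "jane@example.com",  # direct identifier: dropped
    "rider_name": "Jane Doe",          # direct identifier: dropped
    "service": "lyft",
    "fare_usd": 14.50,
    "city": "San Francisco",
    "date": "2017-03-12",
}

print(anonymise_receipt(receipt))
# {'service': 'lyft', 'fare_usd': 14.5, 'city': 'San Francisco', 'month': '2017-03'}
```

Aggregates of records like these (ride counts and average fares per city per month, say) are the kind of signal a competitor could plausibly use as ‘a proxy for the health of Lyft’s business’.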

Within 24 hours of the story breaking, Unroll.me’s co-founder and CEO Jojo Hedaya issued an apology to Unroll.me users in a blog titled “We can do better”:

Our users are the heart of our company and service. So it was heartbreaking to see that some of our users were upset to learn about how we monetize our free service.

And while we try our best to be open about our business model, recent customer feedback tells me we weren’t explicit enough.

Hedaya went on to promise clearer messaging about Unroll.me’s data commercialisation model on its website, in its app and in its FAQs. His apology wasn’t enough to stop Unroll.me users from taking to social networks to share their anger and sense of betrayal, their concerns about the privacy and security of their emails, and instructions for other users to delete their Unroll.me accounts.

Is monetising people’s personal data ‘unethical’?

When stories first broke about Unroll.me’s monetisation of their user data, I was unfazed. As has been repeated in the days since, when a service is free, you’re typically the product. I’d assumed this was pretty widely accepted — it’s sadly the way most free services work. It’s creepy, it’s unsettling, but it’s not *unethical*. For a data practice to be ‘unethical’, it needs to be ‘wrong’ — it needs to cause harm, entrench disadvantage or exploit vulnerable people (or so I’d told myself).

Unroll.me’s other co-founder, Perri Chase (who left when Unroll.me was acquired by Slice), similarly assumed people knew about the ‘cost’ of free services, describing anyone outraged by Unroll.me’s commercialisation model as ‘living under a rock’ in a Medium post:

It starts at the top with the character quality and priorities of the investment community which, btw is not to be nice to the users it is (shocking) to make money!!! I encourage you to go read the Terms of Service of every app you opt in to in order to see what rights they have over your data. This is not new. Is it good? Is it bad? Is that the point? You optin for an awesome free product that clearly states the following and you are offended and surprised? Really?

But as friends and colleagues — many of whom I count as savvy internet users — continued to express anger with Unroll.me, I began questioning my own assumptions. It’s hard to put a fence around data behaviours that would clearly be ‘unethical’ — the language and our expectations of data practice are still evolving — but there are a few things service providers like Unroll.me may need to take into account.

One: People decide what’s ‘ethical’ and ‘unethical’. Accepted practice can become unacceptable.

Last year I read The Immortal Life of Henrietta Lacks by Rebecca Skloot, about how one woman’s cancer cells — taken without her knowledge — became one of the most important tools in modern medicine. I was struck by just how significantly societal attitudes towards control over our own bodies changed over the course of the 20th century.

In one section of the book Skloot examines the practices of Chester Southam, a well-respected virologist in the mid-1950s, who began injecting HeLa cells (Henrietta Lacks’ cancer cells) into cancer patients without their permission, and without telling them what the syringes contained. He also conducted experiments injecting HeLa cells into US prisoners, who were commonly used as lab rats for a range of scientific experiments during this period.

These experiments became the subject of intense media scrutiny and debate about unethical medical practice.

There’s a passage in the book where doctors testify before the Board of Regents of the University of the State of New York, which was considering revoking Southam’s medical licence:

Many doctors testified before the Board of Regents and in the media on Southam’s behalf, saying they’d been conducting similar research for decades. They argued that it was unnecessary to disclose all information to research subjects or get consent in all cases, and that Southam’s behaviour was considered ethical in the field. Southam’s lawyers argued, “if the whole profession is doing it, how can you call it ‘unprofessional conduct’?”

Wellcome Images, Blood Testing Vacuettes (CC-BY-NC-ND)

Today, the idea that every person has an inalienable right to determine what should be done with their own body is fairly widely accepted (although I’d argue we still haven’t quite accepted that this right applies equally to men and women). Informed consent is now fundamental to medical treatment and research.

We don’t really know which of today’s common data practices will come to be seen as unacceptable and unethical. We’re at the very beginning of an age of data-driven services and technologies. Ethics and other niceties are still being figured out. What’s clear, though, is that scrutiny of how organisations treat data is increasing, and stories like the fallout over Unroll.me’s data commercialisation model are growing in frequency and intensity.

Organisations can’t afford to assume that because this is the way data is monetised, people accept it. What’s acceptable now may be unacceptable in five years’ time.

Societal expectations shape the practices and behaviours that we come to view as unethical. Organisations monetising people’s data need to be conscious of how attitudes towards their practices may change.

Two: Our consent to monetisation of our data is rarely ‘informed’ consent.

The purpose of a data use — for example, the commercialisation of customer data — may be ethical. It might make us uncomfortable, but not be sufficiently harmful or seen as sufficiently ‘wrong’ to constitute unethical practice. But an organisation’s decisions and behaviour surrounding its service — particularly in how it communicates that service to customers — need ethical consideration.

Both co-founders of Unroll.me noted in their blogs that Unroll.me’s terms of service tell users that non-personal information sent to their email accounts — “data from and about ‘commercial electronic mail messages’ and ‘transactional or relationship messages’” — can be transferred or sold.

I’ll write about what non-personal data means in this context in a second post. Putting that to one side, as co-founder Perri Chase notes, Unroll.me uses the same kind of language used by most applications that process and sell anonymised and/or identifiable customer data.

But that doesn’t mean people understand what they have consented to when they sign up.

In Unroll.me’s case, anger about the business’s monetisation model seems to stem from not knowing two things:

  1. That Unroll.me was owned by a commercial retail data analytics company called Slice Intelligence.
  2. That Unroll.me/Slice packaged up anonymised user email data and sold it to other businesses.

Unroll.me does not communicate that it’s owned by Slice Intelligence on its website, in its Terms of Service or its FAQs. Given the nature of Slice’s business (selling data), this is information that might reasonably affect people’s comfort with Unroll.me’s service.

There’s no obligation for Unroll.me to include this information. But in the context of providing informed consent for a service to read your emails and sell data about your ‘commercial electronic mail messages’ and ‘transactional or relationship messages’, it’s relevant information.

Similarly, what non-personal or ‘anonymous’ email data means in practice isn’t communicated. And while Unroll.me might assume that users understand that their data is monetised in exchange for the free service, it’s not made explicit to users that their data is the very basis of Unroll.me’s business model.

Opaque terms of service are widespread, but that doesn’t mean the public accepts them. If the way an organisation communicates its use of data prevents customers from making an informed choice about its service — particularly where it involves their personal data — when might this be misleading enough to constitute unethical practice?

Three: A lack of transparency around how personal data is used makes the potential risks hard to measure

In this blog I’ve loosely distinguished generally poor data practices from data practices that stand to cause harm to people or society — and that should be considered unethical.

The problem is, it’s typically hard to assess what harm, if any, might come to people from the sale of their anonymised data. This applies in the case of Unroll.me too. As an Unroll.me user, I don’t know exactly what kinds of information from my emails are being packaged up, or what anonymisation measures have been applied.
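To illustrate why that matters, here’s a hypothetical example (not Unroll.me’s actual practice, which is undisclosed) of a weak anonymisation measure: replacing an email address with its hash produces a stable pseudonym, and anyone who already knows the address can re-link the record to a person.

```python
# Hypothetical illustration: hashing an identifier is pseudonymisation,
# not anonymisation. The hash is deterministic, so a buyer who already
# holds someone's email address can recompute it and re-link the record.
import hashlib

def pseudonymise(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()[:12]

record_sold = {"user": pseudonymise("jane@example.com"), "lyft_rides": 9}

# Re-linking: same input, same pseudonym.
assert record_sold["user"] == pseudonymise("Jane@Example.com ")
```

Whether Unroll.me/Slice’s measures are stronger or weaker than this is exactly what users can’t tell from the outside.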

Exactly who purchases this data (besides Uber) and what they use it for isn’t clear. This erodes trust between Unroll.me and its users.

Openness about how data is used, and who it is shared with, demonstrates to customers that an organisation treats data responsibly and is trustworthy. Organisations don’t just have a duty of care to prevent harm to people and society from use of their data — they can also make proactive commitments to best practice and to certain ethical standards.

In his blog post, Unroll.me co-founder Jojo Hedaya reiterated a commitment to keeping user data private, and to only share [sell] anonymised data. But what does ‘anonymised’ mean? Do we have the necessary understanding to interrogate statements around anonymisation and aggregation?

I’ve kept you here long enough. I’ll tackle this in blog post number two.
