The Future of Our Past – Part 2
I’ve talked before about data and how it now defines our identity and our memories. We’re now producing more data each day than through almost the entirety of history. The digital traces we leave behind with every click, every tweet and even every step we take create a time machine for ourselves. These traces of our existence form the photo album of a lifetime. We don’t have to rely on memory alone but can turn to technology to augment our biological memories and virtually remember everything.
With the prospect, however, of essentially outsourcing our memory to the world’s largest corporations and VC-backed startups alike, there are some far-reaching implications that need to be addressed. These centre around ownership, privacy and longevity.
Who owns the content that we create through the connected devices and digital services with which we now live our lives? Is this purely a case of getting some T&Cs right? This issue is more complex than it might seem – just consider the case of mutual ownership. When you leave a comment on a friend’s Facebook timeline, it becomes a shared memory. You don’t want to give someone the power to take this away from you, but you also need to allow people to protect themselves from potential harassment. Ownership is a messy business. Perhaps access is more important than actual ownership.
Some of you might remember Gowalla. It used to be Foursquare’s arch-rival during the location check-in wars. Gowalla users were encouraged to record the places they explored through check-ins while collecting badges for their virtual passport. It was a lot of fun, and gorgeously designed. I often looked proudly at my passport, recalling the experiences I’d had at all these places with their beautifully designed badges. The “incredible journey” lasted three years until Gowalla shut down after it was acquired by Facebook. What happened to the beautiful badges I had been collecting until then? What happened to the photos I’d uploaded of the locations I’d visited? What happened to my passport? Gowalla co-founder Josh Williams announced shortly after the acquisition, “We plan to provide an easy way to export your Passport data, your Stamp and Pin data (along with your legacy Item data), and your photos as well. Facebook is not acquiring Gowalla’s user data.”
Fast-forward three years and the promise still stands on Gowalla’s public Tumblr, but none of Gowalla’s valued users have ever seen a scrap of their data again. Your passport may still be yours, but it is locked away in a place that nobody dares open up. Williams expressed his disappointment on Quora and explained that the data can only be made available with the approval of Gowalla’s board. In other words, the collective memories of the people who made the service what it was are at the mercy of a handful of investors who got very little out of the whole deal anyway.
I have learned my lesson from the Gowalla fiasco: as a user you may never own your data, but you always need to make sure that you retain access, so that you can get out what you put in. With every service we sign up for, we’re at the mercy of its terms of service, and it could change its stance towards ownership at any time. Publishing content on your own website and syndicating it elsewhere (known as POSSE) is one way to retain ownership, but for most consumers this is not a feasible option. Digital services need to provide sufficient data export tools and APIs, so users retain full access to their digital memories.
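To make the point concrete, here is a minimal sketch of what a data export ("takeout") feature might look like. The record types and field names are hypothetical illustrations, not any real service’s API; the idea is simply that everything a user put in comes back out as one portable, versioned document.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical record type for illustration; a real service would
# have its own schemas (check-ins, photos, comments, badges, ...).
@dataclass
class Memory:
    created_at: str  # ISO 8601 timestamp of the original event
    kind: str        # e.g. "checkin", "photo", "comment"
    content: dict    # the memory itself, in whatever shape it was stored

def export_user_data(memories: list[Memory]) -> str:
    """Serialise everything a user has put in as portable JSON."""
    envelope = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "format_version": 1,  # versioned, so old exports stay readable
        "memories": [asdict(m) for m in memories],
    }
    return json.dumps(envelope, indent=2)

# Usage: a user requests an export of their check-in and photo.
dump = export_user_data([
    Memory("2011-05-01T12:00:00Z", "checkin", {"place": "Alamo Drafthouse"}),
    Memory("2011-06-14T18:30:00Z", "photo", {"url": "https://example.com/p/1.jpg"}),
])
```

The details matter less than the principle: the export is self-describing, human-readable and complete, so the user’s memories survive even if the service does not.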
Privacy is, to a degree, a social norm that is culturally created. We’re now all pretty comfortable being surrounded by camera phones, for example. A few years ago that was a very different story, and we had similar concerns to those now expressed about Google Glass. Privacy in society is malleable, and you only need to glance at some social networks to see that people are willing to share things online today that seem completely alien to older generations.
But privacy is also a human right. The right to privacy, according to the Human Rights Act, protects us from state surveillance and intrusion into our personal lives. This is important because it is fundamentally rooted in human dignity. A government that employs blanket surveillance of its citizens – as brought to light in June 2013 in revelations by whistleblower Edward Snowden – undermines trust, autonomy and freedom of expression.
So, while you have every right to “overshare” whatever you deem newsworthy on Twitter and Facebook, be it photos of your cat or your lunch or how you’re feeling about a certain X-Factor candidate, this doesn’t automatically revoke your right to be free from excessive state surveillance. Digital products need to respect and uphold this right for their users. Storing user data should be a privilege, not a right. The very minimum any digital product creator can do is to tell their users clearly who they are, what data they’re collecting, for what purpose and who the data will be disclosed to. We ought to put users in charge of who has access to their memories. Protecting your users’ dignity, however, doesn’t stop at giving them control.
In February 2014 an operation called “Optic Nerve” became known to the public through documents leaked by Snowden. GCHQ had tapped internet cables and intercepted the Yahoo webcam chats of 1.8 million users in a single six-month period. The British government had not only eavesdropped on, but also stored, large quantities of sexually explicit images that unsuspecting users of Yahoo video chats had been transmitting. At no point were these people suspected of any wrongdoing; the government’s spies stored naked pictures of people of all ages and sent them to their partners at the NSA, simply because they could. If this isn’t violating people’s dignity, I don’t know what is. It appears that Yahoo did not know about this activity and condemned the operation. It’s apparent, however, that they hadn’t taken precautions to encrypt this data sufficiently to protect it from the prying eyes of a paranoid government.
As creators of digital products that everybody today uses to go about their lives, to communicate, and to have experiences of any kind, we have not only a legal but also a moral obligation to protect people’s privacy and dignity in two important ways: Firstly, by being honest and transparent with our users about the extent and use of their data and, secondly, by taking every precaution possible to protect their memories from unauthorised access and excessive surveillance.
In an age where many of the world’s memories are born digital, one of the biggest concerns is longevity. Up until now we seem to have simply accepted that digital services are ephemeral. They shut down as fast as they appear. Like Gowalla, it can happen any day to any of the apps and services you’re currently enjoying or relying on.
That’s not surprising. After all, everything ends. It’s just the law of the universe. And yet, when we talk about technology and digital services we always talk about how we can onboard users and what features we’re going to add in the future. What we rarely talk about, however, is what happens at the end of the service or when people close their accounts.
Many people are simply handed a phone to take pictures of their children and send them “to the cloud”. Before they know it, the weather has changed and this cloud has disintegrated – and they’ve lost all their precious, irretrievable memories to another “pivot” or an “acquihire”. The World Wide Web is a quarter of a century old now – even as it matures, it has already become an incredibly culturally significant store of our collective memory. Facebook is now the number-one place where humans on Earth store their history, says Internet historian Jason Scott. He and his “Archive Team”, a collective of rogue archivists, work day and night to save our modern history by downloading the data from any digital service that is about to shut down – because these memories might otherwise be lost forever.
A Silicon Valley startup mentality may be at odds with taking responsibility for our memories. When you’re dealing with the memory of humankind, you can’t have an exit strategy purely concerned with how you’ll get your ROI. Maybe, in order to secure the future of our digital memories, we need to think less like startups and more like end-ups. When we, as makers of digital products and services, take in user content – whether people upload their baby pictures or we’re asking them to tell their stories – we’re building trust with our community. We’re asking them to put their lives online. It’s important to nurture this trust. If you have an import function, you need to have an export function. Let’s think about the longer term aspects of the customer lifecycle. Let’s think about closure. Let’s start with the end.
What can be done?
All this being said, there are some things digital product creators can implement to address these issues:
- Give your users a simple way to see the data your service is holding about them. People have a legal right, under the Data Protection Act, to see what information you hold about them, but that doesn’t mean you have to treat this concern like a legal process. Apply the same usability principles to your data export feature that you apply to the signup process. Remember that transparency builds trust.
- Deleting data should mean erasing it, not archiving it. This seems counterintuitive to the idea of preserving digital memories, but ultimately this is about control. Even where the right to be forgotten has not (yet) been written into law, you ought to respect your users’ desire to control the information about past events in their lives.
- Provide granular, easy-to-understand privacy controls. In the complex virtual social networks that we have a part in creating, and that we expect people to populate, privacy is not a simple on/off switch. Remind your users frequently who can see their information and give them tools to control this visibility easily and on a case-by-case basis.
- Consider the design of your API. Digital products don’t exist in isolation. Users increasingly expect different apps and services to work together and exchange data. Designers should consider how they can design the integration of third-party services in a way that enables users, rather than just developers, to understand and control this information exchange. An interface like IFTTT is a great example of how users can be empowered to control the movement of data across services.
- Think beyond an exit strategy. Evernote, the service that wants to help you “remember everything” was, famously, set up as a 100-year startup. This strategy encourages sustainable thinking. Think about the value that your product can deliver to your users over a lifetime.
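One of the points above – granular privacy controls – can be sketched in a few lines. This is a deliberately minimal illustration with three hypothetical audience levels; a real social network would need a much richer model (custom lists, blocked users, shared ownership of memories), but the per-item, case-by-case nature of the check is the point.

```python
from enum import Enum

# Hypothetical audience levels for illustration only.
class Visibility(Enum):
    PRIVATE = "private"  # owner only
    FRIENDS = "friends"  # owner plus their friends
    PUBLIC = "public"    # anyone

def can_view(item_visibility: Visibility, viewer: str, owner: str,
             friends_of_owner: set[str]) -> bool:
    """Decide, per item, whether a viewer may see a memory."""
    if viewer == owner:
        return True  # you can always see your own memories
    if item_visibility is Visibility.PUBLIC:
        return True
    if item_visibility is Visibility.FRIENDS:
        return viewer in friends_of_owner
    return False  # PRIVATE: nobody but the owner
```

Because visibility is attached to each item rather than to the whole account, a user can keep their passport public while keeping individual memories private – exactly the kind of case-by-case control the checklist calls for.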
Consumers need to be aware of the issues surrounding ownership, privacy and longevity so they can choose wisely whom they entrust their memories to. But it’s also the responsibility of designers, engineers and entrepreneurs to create sustainable services and tools that are up to the challenge of holding our collective memories. This means ownership, privacy and longevity should be essential parts of your product strategy.
People live many of their everyday experiences entirely through digital services. As designers and creators of digital products we have an obligation to treat the content that people are creating through our tools with the utmost respect and responsibility. Behind every status update, every Instagram, every WhatsApp message there’s a human story. This is the memory of humankind, our history.