Digital Ethics: New Year’s Resolutions for 2015

Source: freeimages.com

In my last post, I discussed some themes for 2015, one of which was an imperative for us as an industry to get serious about digital ethics.

The year was filled with stories–some surprising, some alarming, some downright nuts–about the downstream consequences of decisions about how we deal with data. Consider the following:

  • Seeking to prevent suicides, “Samaritans Radar” raises privacy concerns. In October 2014, the BBC reported that the Samaritans had launched an app that monitored words and phrases such as “hate myself” and “depressed” on Twitter, and notified users if any of the people they followed appeared to be suicidal. Although the app was developed to help people reach out to those in need, privacy advocates expressed concern that the information could be used to target and profile individuals without their consent. According to a petition filed on Change.org, the app was monitoring approximately 900,000 Twitter accounts as of late October. On November 7, the Samaritans suspended the app in response to public feedback.
  • Facebook’s “Emotional Contagion” experiment provokes outrage about its methodology. In June 2014, Facebook’s Adam Kramer published a study in the Proceedings of the National Academy of Sciences revealing that, in the authors’ words, “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” In other words, seeing negative stories on Facebook can make you sad. The experiment provoked outrage over the perceived lack of informed consent, the ethical repercussions of such a study, questions about appropriate peer review, privacy implications, and the precedent such a study might set for other research using digital data.
  • Uber knows when and where (and possibly with whom) you’ve spent the night. In March 2012, Uber posted, and later deleted, a blog post entitled “Rides of Glory,” which revealed patterns, by city, of Uber rides after “brief overnight weekend stays,” also known as the passenger version of the “Walk of Shame.” Uber was later criticized for allegedly demonstrating its “God View” at an industry event, showing attendees the precise location of a particular journalist without his knowledge, and a December 1, 2014 post on Talking Points Memo disclosed the story of a job applicant who was allegedly shown individuals’ live travel information during an interview.
  • A teenager becomes an Internet celebrity—and a target—in one day. Alex Lee, a 16-year-old Target bagger, became a meme (@AlexFromTarget) and a celebrity within hours, based on a photo taken of him unawares at work. He was invited to appear on The Ellen Show, and was also reported to have received death threats on social media.

What these stories have in common is that they center on the way organizations collect, analyze, store, steward, aggregate and use data, both actively and passively, as well as how they communicate about their intentions and actions. I’ve had dozens of related conversations this year with people in business and academia, including of course my fellow board members at the Big Boulder Initiative, about how we develop an ethics for digital data. One of the main themes, and frustrations, is just how amorphous it all is.

Earl Warren, former Chief Justice of the United States, once said, “In civilized life, law floats in a sea of ethics.” So my New Year’s resolution is to begin a process of filtering that sea so we can better understand its component elements. I’ll be starting that process in a document we’ll be publishing in the first quarter, and then in more detail in ongoing research on digital ethics. In the meantime, I wish you all a happy, safe and restful new year!

This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. To learn more about tech news and analysis visit TechPageOne. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.

About susanetlinger

Industry Analyst at Altimeter Group