In social analytics, maturity is relative

Photo: Daniel Kleeman, cc 2.0

At Altimeter Group, we’ve been measuring the state of social business for several years now.

Each year brings new shifts. Some are surprising, others not so much. This year's theme, in "The 2015 State of Social Business" by my colleague Ed Terpening, is a shift from scaling to integrating. That makes sense from a data perspective too.

I have to admit I had an ambivalent reaction to this particular chart, which shows the relative maturity of various aspects of social business today.

[Chart: relative maturity of social business capabilities]
Social engagement–the ability for an organization to interact digitally with communities at scale–is unsurprisingly first, at 72 percent maturity (those figures are self-reported, by the way). Coming up next is social analytics, with 63 percent of organizations surveyed reporting that their programs are mature.

But when you unpack these numbers, they suggest a bit of a different narrative. All of these functions–from social selling to governance to employee recruitment–must be measurable, and must have relevant and credible KPIs to demonstrate performance. Do those exist in these organizations?

It’s hard to know, as these capabilities are themselves maturing, and social identity–the ability to match social posts with individuals–has only achieved maturity in about a quarter of the organizations we surveyed.

So what exactly are these social analytics measuring?

In my own work with brands and social technology companies, the answer is highly variable, but there are some consistent themes. Engagement is pretty measurable, but the outcome of engagement is much harder. Social customer service metrics in many organizations have matured to the point that social service levels aren’t too different from service levels in “traditional” channels. Event/sponsorship activation works when there are ways to attribute outcomes to those programs. But we still struggle mightily with the dimming effect of last-click attribution on the actual, meaningful outcomes we all want to see: revenue generation, cost reduction, time-to-hire, etc.
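To make the last-click problem concrete, here is a toy sketch (my own illustration, not from the report) comparing last-click attribution with a simple linear multi-touch model. The channel names and conversion paths are hypothetical; the point is that last-click assigns social zero credit whenever it appears early in the funnel.

```python
from collections import defaultdict

def attribute(paths, model="last_click"):
    """Assign conversion credit per channel across a list of touchpoint paths."""
    credit = defaultdict(float)
    for path in paths:
        if model == "last_click":
            credit[path[-1]] += 1.0          # all credit goes to the final touch
        elif model == "linear":
            for channel in path:             # spread credit evenly across touches
                credit[channel] += 1.0 / len(path)
    return dict(credit)

# Hypothetical conversion paths: social often starts the journey,
# but search tends to close it.
paths = [
    ["social", "email", "search"],
    ["social", "search"],
    ["search"],
]

print(attribute(paths, "last_click"))  # search gets all the credit; social gets none
print(attribute(paths, "linear"))      # social's early-funnel role becomes visible
```

Under last-click, social earns nothing despite initiating two of the three journeys; a linear model surfaces that contribution. Real multi-touch models (time-decay, algorithmic) are more sophisticated, but the dimming effect is the same in kind.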

What this chart suggests to me is that we are still at a point when, even though analytics have supposedly matured, the actual criteria for business value (impact of social on sales, on activations, on recruitment, on acquisition, on churn) may still be cloudy for many companies.

The other point–not in scope for this research but a related theme–is the extent to which social data is being tied with other data streams. Anecdotally, I’m hearing far more evidence of this in 2015. Most companies I speak with are looking at social data in context of other business-critical data, but norming remains a challenge. And so social analytics tends to revert to the mean, which in this case means counting volumes rather than gauging outcomes.

For the most part, I agree with what Forrester and others have said–that social analytics need to become more predictive. So I look at this data as a bit of context for brands, and a challenge for analytics companies to take up: we need to focus less on volumes and more on holding our own feet to the fire on what executives really care about: real business indicators and outcomes that suggest meaningful action.

You can download “The 2015 State of Social Business” here.


Is your data cheating on you? Five life lessons from the Ashley Madison hack

If you’re not one of the 37 million people whose data was hacked in the Ashley Madison breach, you can breathe a sigh of relief.

Sort of.

The Ashley Madison story may be great for a few news cycles of schadenfreude, but it also illustrates the realities we face in the age of data ubiquity: as people, consumers, businesspeople, patients and citizens.

1. Intimate data about us is everywhere. Our purchases, location, sexuality, religion, health history, political party, whose house we went to last night, the stiletto heels or sleek watch or expensive bourbon we clicked on somewhere on the web–it’s all out there, somewhere. In most cases this data is protected by layers of security, encryption, policy and regulation, but, as we’ve seen from Anthem to Target to Ashley Madison, that protection is not always effective. Beyond data security, however, is the question of how this data is actually used by the businesses that collect it. Is it to deliver better services, products and ads? Is it being sold to a third party?

2. Profiling is not just for the FBI. Marketers love profiling. Why? Because good marketers realize that it’s good business to sell you something you are likely to want, rather than wasting your attention (and their money) on trying to sell you something you don’t. So, naturally, they want to know more about you: who you are, what you covet, where you shop, where you live, how old you are and how much money you have, so they can target ads and products and services more effectively. Whoever you are, you’re profiled somewhere: thrifty boomer, young married, millennial hipster; sounds like a Hollywood casting call, doesn’t it? Like any tool, profiling can be extremely effective when properly used, dangerous if not.

3. You leave digital footsteps everywhere you go, and they just may live forever. Everywhere you go, you leave digital traces. Even if you were “just browsing” in a store, you may have left a digital trace if you used a retail app, and/or the store used beacons or shelf weights. Add to that your web, mobile and social activity, and any apps you’ve used. Now imagine a ten-year timeline of that data being used to try to predict your next purchase. Or next spouse.

4. Chances are, you haven’t the slightest idea what data is being collected about you at any given time. If you want to do a simple test, install Ghostery on your web browser for a while. It’ll tell you what data is being collected by the website you’re using. Did you know this data is collected? Do you know how it’s used? I bet not.
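If you want a feel for what a tool like Ghostery is surfacing, here is a rough sketch (my own illustration, not how Ghostery actually works) that counts requests a page makes to hosts other than its own. All URLs below are hypothetical; real tools also compare registrable domains so that a site's own CDN subdomains aren't flagged.

```python
from collections import Counter
from urllib.parse import urlparse

def third_party_hosts(page_url, request_urls):
    """Count requests to hosts other than the page's own host --
    a crude proxy for the third-party trackers loaded by a page."""
    first_party = urlparse(page_url).hostname
    counts = Counter()
    for url in request_urls:
        host = urlparse(url).hostname
        if host and host != first_party:
            counts[host] += 1
    return counts

# Hypothetical network requests captured while loading one article page.
requests = [
    "https://example.com/article",                  # the page itself (first party)
    "https://cdn.example.com/style.css",            # subdomain: flagged by this crude check
    "https://tracker.adnetwork.example/pixel.gif",  # ad-network tracking pixel
    "https://analytics.example.net/collect",        # analytics beacon
    "https://analytics.example.net/collect",
]

print(third_party_hosts("https://example.com/article", requests))
```

Even this toy version makes the point: a single page view can fan out into requests to hosts you've never heard of, each of which may be collecting something about you.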

5. Your data may be cheating on you. When you clicked “Accept” on any one of a number of apps you used, or bought a book, or downloaded a movie, you may have digitally consented to share this data with third parties. But did you really know what you were consenting to? Sometimes this is a non-issue (some companies will never share your data with others). Sometimes it can have uncomfortable implications, as when Borders declared bankruptcy, and decided to sell one of its greatest assets–its customer purchase history. (The FTC stepped in and required Borders to provide an “opt out” option).

To be clear, I’m not saying any of this is inherently bad, or suggesting we can roll back the clock; it’s just reality these days. But as data becomes more intrinsic to our lives and our business, I believe in finding “teachable moments” anywhere we can:

  1. As individuals, there will never be a better time to educate ourselves about what tradeoffs we are making, consciously or unconsciously, with our data.
  2. As business people, we need to decide what kind of data stewards we will be, especially as data becomes more ingrained in business strategy.
  3. As an industry, we need to start putting clear and practical norms in place to clarify these issues so that we can have a fair and productive conversation about them and, frankly, set a good example.

I’ve outlined a lot of these issues and recommendations in The Trust Imperative: A Framework for Ethical Data Use. If you’re not lying on a beach somewhere, I’d love your thoughts and feedback.

 


Altimeter Group Joins Forces with Prophet

Amid the flurry of excitement today, I wanted to make sure to mark a momentous occasion: the company I work for, Altimeter Group, announced that we have been acquired by Prophet. It’s a great move for both teams; we have similar clients, outlooks and cultures, which always makes for a great partnership. And we can help each other in many ways.

Here’s a video describing the new relationship, and what we hope to do together…

…and a nice story by Anthony Ha in TechCrunch. I’m thrilled for Charlene, who has taken this company to an important milestone, and for my colleagues old and new.

More to come!


What Brands Can Learn From Pinterest’s Privacy Updates

In the midst of all the complexity and fear about data usage and privacy, it’s nice to see an example of disclosure done well.

A couple of weeks ago, Pinterest announced Buyable Pins, which will enable their users to buy products directly from Pinterest on iPhones and iPads. Like any new feature, this one comes with data privacy implications: if I buy something on Pinterest, both Pinterest and the seller will have access to this transaction information–and possibly more about me.

I’m a Pinterest user myself, so last week I received this email.

[Screenshot: Pinterest’s privacy update email]

Long story short: Pinterest and the seller receive enough information to complete the transaction, facilitate future transactions and make promotions more relevant to me. If I don’t want to share information to customize my experience, I can turn it off. Short, sweet and to the point.

If I want more information, Pinterest’s privacy policy covers a range of other issues in similarly clear language. The other thing I like about it is that it prompts me to dig deeper if I want to. Clearly, this should be true of any privacy policy update, but the naturalistic and concise nature of the language makes that process a little less intimidating.

I asked the Pinterest team what they were trying to achieve with the privacy language, and here’s what they told me:

Buyable Pins has been a highly requested feature, so we wanted to make sure the language for the policy was clear right from the start. The goal was for Pinners to have an understanding of why the updates are being made, how they can customize settings, and where they can learn more. The approach was similar to past policy updates, where we aim to put Pinners first and be as helpful and concise as possible.

There are two really important issues at play here: 1. people have been asking for this feature, so there is going to be a lot of scrutiny among the pinner community; and 2. Pinterest is now dealing with people’s money. So there’s a lot at stake.

Privacy Policies in Context

Two weeks ago, we at Altimeter Group published The Trust Imperative: A Framework for Ethical Data Use. The central framework in this report combines the data life cycle with ethical data use principles developed by the Information Accountability Foundation (IAF).

[Figure: ethical data use framework, combining the data life cycle with the IAF principles]

The Pinterest privacy policy explicitly fits into two areas of the framework:

  • Collection and Respect. Have we been transparent about the fact that we collect data?
  • Communication and Respect. Have we communicated clearly about what information we collect, and why?

This is why our use of language is an ethical choice:

While dense and legalistic language may satisfy the legal team, clear and simple language demonstrates respect for the user. 

You could further state that Pinterest, like many other ad-supported sites, is arguing that increasing the relevance of promoted pins is a benefit to pinners, which would cover Collection and Benefit as well. [That argument only holds up if users agree that the benefit is worth the exchange of data.]

This is not to say that a privacy policy is the only thing organizations need to consider when it comes to ethical data use. Many other issues have gotten organizations into hot water, whether in courts of law or public opinion. Some top-of-mind examples include Borders (for attempting to sell customer transaction data as part of its bankruptcy process) or Anthem and others (for data breaches). These examples map to Respect/Fairness and Usage, and Respect/Fairness and Storage and Security, respectively.

But now that the framework is out, I will be testing it (and suggest you do too) against real-world examples, using the IAF principles and the data lifecycle stages to examine and illustrate examples of ethical data use in theory and, most importantly, in practice.


The Trust Imperative: A Framework for Ethical Data Use

Consider this: Consumers don’t trust the way organizations use their data. CEOs are concerned that lack of trust will harm reputation and growth. People who don’t trust companies are less likely to buy from them. Yet the default option for businesses using consumer data is that it’s a right, not a privilege.

That tide is turning, and quickly.

Altimeter Group’s new report, “The Trust Imperative: A Framework for Ethical Data Use,” explores the dynamics driving people’s concerns about data use, recent research about their attitudes and behaviors, and proposals by industry leaders such as The Information Accountability Foundation, The Governance Lab (GovLab) at New York University and The World Economic Forum.

Our objective for this research is to propose an approach for data use that reveals insight and honors the trust of consumers, citizens and communities.

And, while ethical data use is a fraught issue today, it will be even more so in the near future. As predictive analytics, virtual reality and artificial intelligence move into the mainstream, the implications (and capabilities) of data will become even more urgent and complex.

We no longer live in a world where privacy is binary; it’s as contextual and fluid as the networks, services and devices we use, and the ways in which we use them.

The next step is to make the topic of ethical data use–admittedly a broad and undefined one–pragmatic and actionable. We do this by bringing together the principles of ethical data use developed by the Information Accountability Foundation (IAF) and the specific stages of data use into a cohesive framework.

Our thesis is simple: ethical data use must be woven into the fabric of the organization; weakness in one area can leave the entire organization exposed.

[Figure: ethical data use framework, combining the data life cycle with the IAF principles]

In addition to this framework, the report lays out an argument for why ethical data use is a brand issue, annotated by examples from multiple industries. It includes actionable recommendations to enable organizations to apply these principles pragmatically.

Clearly, this is just the beginning; we will continue to deepen this research and learn best practices from academics who are exploring these issues and businesses who are faced with these questions and challenges every day. We are also in the process of building a roadmap for ethical data use from this framework that will help organizations assess and remediate the risks (and uncover the opportunities) related to their use of data.

My thanks to everyone who, implicitly or explicitly, contributed to this report. Most importantly, I’d like to express my deepest gratitude to the Information Accountability Foundation, whose excellent “A Unified Ethical Frame for Big Data Analysis” underpins this work. I would also like to thank my colleague Jessica Groopman, who collaborated on the research and framework development, and whose excellent research on “Privacy in the Internet of Things” will be published shortly.

As always, we welcome your feedback, questions and suggestions as we work to add clarity and action to a complex topic.

 


Is Twitter Obligated to Preserve Politicians’ Deleted Tweets?

Politicians everywhere are probably celebrating like crazy this week.

The Sunlight Foundation, an organization dedicated to making politics accountable and transparent, was just told by Twitter that it would no longer have access to its developer API, which enabled the site’s “Politwoops” app to track politicians’ deleted tweets. The story has been covered in news outlets from Gawker to The Washington Post, but the consensus is pretty much the same: Twitter was wrong.

This little story explains a lot about why the digital medium is so confoundingly hard. As citizens, we want to know that politicians’ communications are preserved; after all, as Philip Bump argued in the Post article, “The reason Anthony Weiner is no longer a member of Congress is because he sent a photograph of himself in his underwear to someone on Twitter.” Politwoops is further credited with capturing deleted tweets from politicians on everything from Cyndi Lauper’s hotness to backtracking on Bowe Bergdahl.

It’s a fair point. Richard Nixon resigned because the release of the Watergate Tapes (and revelations about that 18-minute gap) made it impossible for him to continue to serve as President.  As Justice Louis D. Brandeis famously said, “sunlight is the best disinfectant.”

But unfortunately it’s not that simple in this case. Bear with me.

The Sunlight Foundation issue brings back a question that has existed since the beginnings of the Internet: whether a site such as Twitter is more like a magazine (which created the information it contains) or a newsstand (which simply makes it accessible to the public). My bet (and I have no inside information here) is that Twitter would argue that it is more like a newsstand, where the user (in this case the politician) is the magazine. So if a politician decides to remove content, Twitter would have no obligation to restore it (and actually might incur liability by doing so). This example also shows the limits of interpreting 21st-century realities using 19th-century concepts, but that’s another issue for another day.

So, as a legal expert explained to me, the public may have an interest in seeing deleted tweets, but in the ordinary course, they wouldn’t have a right to see them. I imagine that the phrase “in the ordinary course” is the key here; if there were a court order, that would possibly be a different story, and the Twitter lawyers would have to hash it out with the politician’s and the prosecutor’s lawyers. 

Now this isn’t to say that those deleted tweets can’t be captured at all; it just means that Twitter will not be party to it by providing access to its developer API. Doing so would mean the company is selectively enforcing its own terms of service, which could compromise the trust of all users. How would we then know who they enforce it for, and who they don’t? This issue becomes even more complex outside the United States, where privacy norms and the “Right to be Forgotten” legislation in the EU add a completely different dimension. We can’t forget that Twitter is a global company, after all.

So there you have it. As a citizen, I wish Twitter had made a different choice. As a user, I’m relieved they didn’t.


Tim Cook Just Threw Down on Data Privacy, And It Was Awesome


Photo: Jessica Paterson, cc. 2.0

Apple CEO Tim Cook gave what TechCrunch called a “blistering speech” on data privacy Monday night at the Electronic Privacy Information Center (EPIC) “Champions of Freedom” event.

“I’m speaking to you from Silicon Valley, where some of the most prominent and successful companies have built their businesses by lulling their customers into complacency about their personal information,” Cook said. “They’re gobbling up everything they can learn about you and trying to monetize it. We think that’s wrong. And it’s not the kind of company that Apple wants to be.”

Cook’s speech has sparked thousands of news articles and questions about his motives, plans, wisdom in poking the bear(s), whether Cook–and Apple–is really prepared to address privacy head-on, and what it all means for the ecosystem of digital products and services.

After all, as Vala Afshar has said, “if the service is free, then you are the product.”

We all know that, right?!? Can we please just move on?

Tempting as it may be for those distracted by all that juicy data, there is a second, critical question.

What do we do about it? As technology developers? Organizations? Consumers?

This is a conversation that has to happen. Seriously. Now. With action. And clear outcomes.

Privacy is not about some vague, rose-colored future; it’s about trust, and what happens when consumers distrust the organizations with which they interact.

  • PwC has said that, as of last year, fifty percent of CEOs surveyed identify lack of trust “as a real threat to their growth prospects.”
  • The World Economic Forum is examining what trust means, and how to decode it.
  • The Edelman Trust Barometer reveals how lack of trust actually affects consumer behavior (hint: it’s not good).
  • We at Altimeter Group are working on research on consumer attitudes about data privacy, and a framework for ethical data use, to come.

It doesn’t have to be a zero-sum game. Just because we honor privacy on one hand, doesn’t mean we have to limit innovation on the other. It just demands a new calibration of what innovation means, and what other models we can imagine to support both insight AND trust.

Tim Cook just threw down a pretty big gauntlet for the industry (and for Apple). Facebook, which he called out in his speech, is doing promising work with DataSift (disclosure: client) to deliver privacy-safe “topic data.” I don’t know what plans, if any, Google has in this direction, but the momentum of this issue can’t be lost on them.

There are no easy answers, but there are informed choices to be made. And better do it now, before the wave of predictive analytics, Internet of Things, augmented and virtual reality and other technologies yet to be invented really get going.

</endrant>

Want to talk to us about data privacy? Have something valuable and unique to add to the conversation? Please let us know.

 
