
Don’t Be Strava: Six Ways to Break Your Data Stockholm Syndrome


[Image: Strava heat map of Mogadishu]

In the last few days, there has been a lot of hand-wringing after Strava, a run tracking app, published a global heat map of its users’ exercise paths. As it turns out, in many of the world’s most dangerous locations, the only people using social exercise tracking apps are expatriate military and humanitarian staff, making it fairly easy to identify their location – bad news given that they are high-risk targets.

Predictably, a lot of blame is being thrown around. Congress is demanding answers, the military is reviewing its security policies, Strava is pushing for soldiers and humanitarians to opt out of location sharing, and commentators have declared the end of secrets.

The outrage was prompted by recent media coverage of an Australian student identifying remote military outposts, but Strava published the updated version of the map in November of last year. The backlash isn’t because the data was published; it’s because the media, military, and public are realizing what can be done with it. And it’s easier to express breathless surprise than to acknowledge that this is the obvious outcome of our Data Stockholm Syndrome.

Stockholm Syndrome is a name for the “irrational[ly positive] feelings of some captives for their captors.” Data Stockholm Syndrome is the idea that we irrationally celebrate companies who capture lots of data about us – and then seem eager to blame everyone involved when we realize what that data can do.

Is Strava at Fault?

Let’s start with Strava – a company whose entire business is explicitly based on users sharing their exercise data. To Strava, data sharing is a feature, not a bug, and although the CEO has committed to working with the military to protect data, they don’t think they’ve done anything wrong.

Sure, their privacy settings are hard to navigate and their terms of service are open-ended – but that sends a clear, unfortunately common, message. They are here to share our data and there’s no reason to think they’d do anything else. Blaming Strava is like being surprised that you got kidnapped after hiring someone to kidnap you.

Or the Runners?

It’s also hard not to sympathize with the runners, who simply kept tracking their workouts. Maybe they’d gotten guidance to turn off all third-party apps, as the Marine policy suggests, but probably not.

We all know, conceptually, that good data hygiene is important – but how often is it a priority? Even where there are privacy settings, as Zeynep Tufekci pointed out in the New York Times, it’s impossible to know all the ways that data can be used against us.

And platforms and employers prefer to hold individuals responsible instead of building realistic solutions. Sure, individuals can do things that increase or decrease how easy it is to publish their data, but the digital economy is designed to capture our data. Blaming users is like blaming a victim for getting kidnapped.

Or Civil Society Organizations?

Which is not to say that civil society organizations have it easy – there’s no practical way to track the full range of commercially available applications, their data policies, or the number of ways they could collect or lose control of data and cause harm.

Civil society organizations may understand how dangerous their digital privacy situation is, but that doesn’t mean they can monitor the entire market. Google removed 700,000 malicious Android apps from its store in 2017 alone.

And, if we’re being really honest, most civil society organizations are probably just glad that the public backlash is against Strava – on any other day, this could have come from one of the many open data projects that routinely publish maps of very specific behaviors.

Even if it didn’t come directly from a civil society organization – how many groups have built data sharing partnerships with technology companies, whether directly (like with Airbnb, IBM, and Google) or indirectly through things like open data exchanges? Just yesterday, the head of UNICEF said that their new strategy is to co-create humanitarian interventions with businesses, based on their data.

What Can We Do to Protect Ourselves?

The knee-jerk reaction is to push for government regulation or human rights protections. While we should always work toward better laws, there are a lot of flaws in putting all our eggs in that basket. As I’ve written before, the law isn’t good at classifying data or risk, regulators have limited jurisdictions, and courts are terrible at adjudicating these kinds of cases.

Even worse, the 2018 World Justice Report declared “a crisis for human rights” after showing that rule of law and human rights systems are weakening in two-thirds of the countries measured. In other words, when it comes to data capture, the law isn’t particularly effective – and where it is, it’s good at punishing the criminal, not saving the victim.

The good news is that, like protecting yourself from kidnapping, there are no perfect answers – but there are a lot of little things we can do that, together, make a big difference. We don’t need international treaties to write better procurement contracts, we don’t need government interdiction to have workplace policies, and we don’t need commercial regulation to negotiate privacy policies that do more than boilerplate terms of service.

Six Actions Your Organization Can Do Right Now

1. Do Threat Modeling

Not every office is sensitive, nor is every population – and the way we ask employees, partners, and visitors to adjust their behavior should be responsive to the threats we face. The once-blurry line between commercial and state-sponsored surveillance is all but gone.

If your organization works in sensitive places or with vulnerable populations, you should extend your physical security practices to your team’s digital footprint. It’s also important to recognize that digital threat models (and the value and risks of data) change over time – so models should err on the side of risk mitigation (whether for us or for those we serve) and need to be translated into operational processes.
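To make that concrete, here is a minimal sketch of one such operational process: a script that flags points in an exercise track that fall inside a buffer around a sensitive site. Everything in it – the site name, the coordinates (roughly Mogadishu, echoing the heat map above), the radius – is a hypothetical placeholder; a real threat model would maintain that list as part of routine security operations.

```python
# Minimal sketch: flag GPS points that breach a buffer around sensitive sites.
# Site names, coordinates, and radii below are hypothetical placeholders.
from math import radians, sin, cos, asin, sqrt

SENSITIVE_SITES = [
    {"name": "field-office-alpha", "lat": 2.0469, "lon": 45.3182, "radius_km": 1.0},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def flag_exposures(track):
    """Return (point, site name) pairs where a track point breaches a site buffer."""
    exposures = []
    for lat, lon in track:
        for site in SENSITIVE_SITES:
            if haversine_km(lat, lon, site["lat"], site["lon"]) <= site["radius_km"]:
                exposures.append(((lat, lon), site["name"]))
    return exposures

# One point inside the buffer, one well outside it.
print(flag_exposures([(2.0470, 45.3185), (2.2000, 45.5000)]))
```

Run as a pre-publication check on any track a team member is about to share, something like this turns “don’t leak our locations” from a policy statement into a repeatable process.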

2. Manage Digital Footprints

It’s one thing to have a policy; it’s another to make sure that new staff and contractors actually adjust their digital habits when they arrive. While this might seem invasive, we have too many examples of companies changing their minds or getting hacked to ignore the risk.

Whether through acts of commission (like Roomba and WhatsApp) or omission (like Equifax, Facebook, and Yahoo!), companies have given us plenty of proof that data, once collected, is not secure. And, if you think about it, this isn’t new.

Most smartphones have airplane modes for exactly this reason – and a number of government offices have phone checks. Your organization may, or may not, need to limit the digital footprint of those that come through your doors – but you should know whether you do, and you should ask your visitors to respect your safety.

3. Update Employment and Partner Contracts

The biggest threat to any data system is, without rival, us. Human failure – whether an actual mistake, or just the failure to exercise appropriate digital hygiene – is one of the largest threats to the data and data subjects that civil society organizations support.

We can’t always change the apps our employees and our partners use – but we can set standards for whether and when they use them while working with us. If you think about it, our contracts routinely address confidentiality and non-disclosure.

We need to update those documents to reflect our digital context, and the new companies trying to access our work. Just as we set limitations and conditions around the way that we use public money, organizations must negotiate for more defined and limited use of public data.

4. Review Terms of Service

At this point, it’s clichéd to say almost anything about reading terms of service – or to believe that they matter. And yet, the Strava example should be a wake-up call: these contracts do matter.

Organizations – particularly large organizations – have the power to make big purchasing decisions, and to demand consideration from those they pay for service. Technology platforms are vendors, and like all vendors, there won’t be any progress until we at least begin the negotiation.

Civil society organizations, funders, and, frankly, users can start spending their time and resources on demanding more limited data licenses from technology companies. And if you think that civil society organizations aren’t rich enough to get their attention – just look at how many big data companies are trying to partner with civil society groups to get access to their markets.

We will continue to lose 100 percent of the negotiations we’re not willing to start – and those negotiations belong not on social media, but in the contracts that matter.

5. Understand Open Data vs. Data Subjects

Open data is a cumulative issue – the more data that’s in the public domain, the easier it is to draw the correlations and inferences that reveal sensitive information. There have already been significant studies demonstrating that data scientists can re-identify an individual from as few as four random data points. Strava released 13 trillion pieces of location data.
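To see why four points are enough, here is a toy sketch on synthetic data – not the methodology of those studies – showing that in an “anonymized” set of a thousand location traces, a handful of observed (place, hour) points typically matches exactly one person.

```python
# Toy sketch: how few observations pin down one trace in an "anonymized" set.
# All data here is synthetic; no real traces are involved.
import random

random.seed(1)

# Synthetic dataset: user_id -> set of (cell, hour) visits, no names attached.
traces = {
    uid: {(random.randrange(50), random.randrange(24)) for _ in range(40)}
    for uid in range(1000)
}

def matching_users(observations):
    """Users whose trace contains every observed (cell, hour) point."""
    return [uid for uid, visits in traces.items() if observations <= visits]

# An adversary observes just four points from user 7's movements.
observed = set(random.sample(sorted(traces[7]), 4))
candidates = matching_users(observed)
print(f"{len(candidates)} user(s) match the four observed points: {candidates}")
# Almost always prints: 1 user(s) match the four observed points: [7]
```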

Regardless of whether your organization is releasing location data itself, any data you do release can now be combined with all of those trillions of data points, among many other data sets.

Professional best practice for managing this tension is to analyze each data release for its human rights implications, risks, and rewards. That’s good practice, but realistically, many risks are unknowable. Instead of focusing on assessing the risk posed by a data set, we should focus on the risk posed to the data subject. For civil society, the priority should be the data subjects, not the data sets – and our data release policies should reflect that.

6. Collect Less Data

Finally, there is a lot of talk about ways to minimize data, to treat it responsibly, or to design ethical systems around data – all of which are necessary, and none of which are sufficient. There’s a lot to be said for minimizing our own data footprints and for only collecting data that we know we need. But that won’t help protect data subjects if we do it in partnership with organizations, or using platforms, that don’t have the same values.
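As a sketch of what minimization can look like at the point of collection – the field names and precision threshold here are hypothetical – the function below keeps only the fields a program actually needs and coarsens coordinates so they support district-level analysis without locating a household.

```python
# Minimal sketch of data minimization at collection time.
# Field names and the precision threshold are hypothetical examples.
def minimize_record(raw, keep_fields=("age_band", "district"), coord_decimals=2):
    """Keep only needed fields and coarsen location precision.

    Rounding coordinates to 2 decimal places leaves them accurate to
    roughly a kilometer: enough for district-level analysis, not enough
    to locate a household.
    """
    record = {field: raw[field] for field in keep_fields if field in raw}
    if "lat" in raw and "lon" in raw:
        record["lat"] = round(raw["lat"], coord_decimals)
        record["lon"] = round(raw["lon"], coord_decimals)
    return record

raw = {
    "name": "...",    # never needed downstream, so never stored
    "phone": "...",   # never stored
    "age_band": "25-34",
    "district": "Hodan",
    "lat": 2.046912,
    "lon": 45.318245,
}
print(minimize_record(raw))
# {'age_band': '25-34', 'district': 'Hodan', 'lat': 2.05, 'lon': 45.32}
```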

As counter-intuitive as it is to the culture of civil society groups, if we’re going to abide by the principle of do no harm, we have to understand the harms, our partners, and our digital interventions.

More often than not, when we think of law, we think of big courtrooms and expensive lawyers. Or we think about treaties and human rights. Those are both valuable, but that’s not the most common form of law in technology. In technology and data law, contracts are what determine our rights. And the good news is that contracts are negotiable.

That’s not to suggest that it will be easy. To belabor the earlier metaphor, negotiating with cash-rich technology companies, or platforms, can feel like hostage negotiations. But when humanitarians are the safety net for the most vulnerable – that’s exactly what they’re doing. And if we don’t start negotiating for our data, and stop admiring its captors, we’re just as likely to be the next victim.

By Sean Martin McDonald, CEO of FrontlineSMS

The post Don’t Be Strava: Six Ways to Break Your Data Stockholm Syndrome appeared first on ICTworks.

