
Google’s New Open Source Privacy Effort Looks Back to the ’60s

Google is building a new open-source tool designed to preserve privacy when analyzing large amounts of data. The company’s researchers unveiled their work at a computer security conference this week.
The project, called RAPPOR (Randomized Aggregatable Privacy-Preserving Ordinal Response), builds on a 1960s-vintage technique called randomized response, which underpins the modern field of differential privacy. The technique scrambles data sets so they remain statistically sound while preventing any given data point from being traced back to the individual who provided it.
Differential privacy could help protect people from being identified personally as companies and researchers collect and mash up data in search of valuable patterns. For example, a fitness tracking company could publish aggregate data on users’ heart rates without the risk that people with access to the individual data points — including the tracking company’s employees — could identify sensitive health information pertaining to an individual user. Or a dating site could analyze metadata about its users without delving into any one person’s sexual preferences.
Here’s how it works: Researchers asking a sensitive yes-or-no question tell the subject to flip a coin before answering. If the coin comes up heads, the subject answers yes regardless of the true answer. If it comes up tails, the subject answers truthfully. This lets the subject plausibly deny any yes answer, while still allowing the researchers to estimate the statistical rate of true yes answers across the population.
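The coin-flip scheme above can be sketched in a few lines of Python. This is a minimal illustration of classic randomized response, not RAPPOR itself; the function names and the 30% simulated rate are invented for the example. The key arithmetic: since heads forces a yes half the time, the probability of a reported yes is 0.5 + 0.5 × (true rate), so the true rate can be recovered as 2 × (observed yes rate) − 1.

```python
import random

def randomized_response(truth: bool) -> bool:
    """Answer a sensitive yes/no question with plausible deniability.

    Flip a fair coin: heads -> report yes regardless of the truth;
    tails -> report the true answer.
    """
    if random.random() < 0.5:  # heads
        return True
    return truth               # tails: answer truthfully

def estimate_true_rate(reports: list[bool]) -> float:
    """Recover the population's true yes-rate from the noisy reports.

    P(reported yes) = 0.5 + 0.5 * true_rate,
    so true_rate = 2 * P(reported yes) - 1.
    """
    observed = sum(reports) / len(reports)
    return max(0.0, 2 * observed - 1)

# Simulate a population in which 30% would truthfully answer yes.
random.seed(1)
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in population]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

No individual report reveals anything definitive — a yes may simply mean the coin came up heads — yet with enough respondents the aggregate estimate converges on the true rate.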
Google is piloting the technique to track how people use the company’s software. Say Google wanted to see how many users blocked tracking cookies in the Chrome browser. To find out, Google employees typically would have to track the very users who didn’t want to be tracked. Using RAPPOR, they would have no way to learn the preferences of individual users.
It’s unclear how extensively Google is using the tool, which is an experiment for now. It’s also unclear how complicated it would be for other companies to adopt it.
Randomized response was originally used in the 1960s by researchers asking subjects to report whether they were infected with sexually transmitted diseases. The researchers wanted a system that made it impossible for anyone, including the researchers themselves, to go back into the database and recover individual answers, according to Joe Hall, chief technology officer at the privacy advocacy group the Center for Democracy and Technology. Hall explained the technique in a recent blog post.
