Rachel is a student at a US university who was sexually assaulted on campus. She decided not to report it (fewer than 10% of survivors do). What she did instead was record the attack on a website that uses new ideas from cryptography to help catch serial sexual predators.
The non-profit Callisto allows a survivor to enter their name into a database, along with details that identify their attacker, such as a social media handle or phone number. These details are encrypted, meaning the identities of both the survivor and the perpetrator remain anonymous. Even if the database were hacked, there would be no way to identify either party.
If, however, the same perpetrator is named by two survivors, the website registers a match and triggers an email to two lawyers. Each lawyer receives the name of one of the survivors (but not of the perpetrator). The lawyers then contact the survivors, inform them of the match and offer to help coordinate any further action, should they wish to pursue it.
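Callisto's real protocol is more sophisticated than anything this short, but the matching idea can be sketched in a few lines. In this toy version (every name, key and field below is hypothetical, and a single keyed server like this would be a weakness a real design avoids), the database stores only blinded tokens of the perpetrator's identifier, and a match fires when two independent submissions produce the same token:

```python
import hashlib
import hmac

# Hypothetical server-side key; in a real system no single party
# should be able to unblind identifiers like this.
SERVER_KEY = b"hypothetical-demo-key"

def blind(identifier: str) -> str:
    """Keyed hash of a normalised identifier (e.g. a social media handle)."""
    normalised = identifier.strip().lower()
    return hmac.new(SERVER_KEY, normalised.encode(), hashlib.sha256).hexdigest()

class MatchingDatabase:
    """Stores blinded tokens only; raw identifiers never sit in the database."""

    def __init__(self) -> None:
        self.entries: dict[str, list[str]] = {}

    def submit(self, survivor_contact: str, perpetrator_id: str) -> bool:
        token = blind(perpetrator_id)
        self.entries.setdefault(token, []).append(survivor_contact)
        # A match fires once two independent entries name the same perpetrator.
        return len(self.entries[token]) >= 2

db = MatchingDatabase()
assert db.submit("survivor-a@example.org", "@same_offender") is False
assert db.submit("survivor-b@example.org", " @Same_Offender") is True
```

The point of the sketch is only the shape of the mechanism: nothing readable about either party is stored, yet a repeat identifier still produces a match.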
In short, Callisto enables survivors of sexual assault to do something previously impossible: they can find out whether their abuser is a repeat offender without identifying themselves to the authorities, or even naming the abuser. They learn something potentially actionable without giving anything away. “Survivors can find healing in knowing they are not alone, and in knowing it wasn’t their fault,” says Tracy DeTomasi, CEO of Callisto. And there is strength in numbers. “One person alone may not have a case, but two people might.”
That two strangers can combine what they know without revealing any private information to each other is a seemingly paradoxical idea from theoretical computer science, and it is fuelling what many are calling the next revolution in technology. The same theory allows, for example, two governments to discover that their computer systems have been hacked by the same adversary without either divulging classified information, or two banks to learn that they have been defrauded by the same person without breaching financial data-protection laws.
The umbrella term for these new cryptographic techniques, which let organisations extract value from data while keeping the data itself private, is “privacy-enhancing technologies”, or Pets. They offer data owners opportunities to pool their data in new and useful ways. In healthcare, for example, strict regulations prohibit hospitals from sharing patients’ medical records. Yet if hospitals could combine their data into larger datasets, doctors would have more information on which to base treatment decisions. Indeed, a Pets project in Switzerland has since June allowed medical researchers at four independent teaching hospitals to analyse pooled data from nearly 250,000 patients, with no loss of confidentiality between institutions. Juan Troncoso, co-founder and CEO of Tune Insight, the company running the project, says: “The dream of personalised medicine relies on bigger and better datasets. Pets can make that dream come true while complying with regulations and protecting people’s privacy rights. This technology will be transformative for precision medicine and beyond.”
The last few years have seen the emergence of dozens of Pet startups in advertising, insurance, marketing, machine learning, cybersecurity, fintech and cryptocurrency. Governments are also interested. Last year, the United Nations launched a “Pet Lab”, which has nothing to do with the welfare of domestic animals, but is instead a forum for national statistical offices to find ways to share data across borders while protecting the privacy of their citizens.
“Pets are one of the most important technologies of our generation,” says Jack Fitzsimons, founder of the UN Pet Lab. “They have fundamentally changed the game, because they promise that personal data can be used only for the purposes for which it was intended.”
The theoretical ideas on which Pets are based are now several decades old. In 1982, the computer scientist Andrew Yao posed the following question: can two millionaires discover who is richer without either revealing how much they are worth? The counterintuitive answer is yes, it is possible. The solution involves a process in which the millionaires send packets of information back and forth, using randomness to conceal the exact numbers; at the end, both know who is richer but nothing else about the other’s wealth.
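Yao's actual solution is intricate, but the flavour of computing on combined secrets can be shown with a simpler member of the same family: additive secret sharing, in which each party splits its private number into random-looking shares. This is an illustrative sketch, not Yao's protocol, and the salary figures are invented:

```python
import secrets

# All arithmetic happens modulo a large public constant.
M = 2**61 - 1

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split a value into n random-looking shares that sum to it mod M."""
    shares = [secrets.randbelow(M) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % M)
    return shares

# Invented example: three colleagues learn their combined salary,
# but no one ever sees an individual figure.
salaries = [61_000, 74_000, 58_000]
all_shares = [share(s) for s in salaries]

# Party i receives the i-th share from every colleague and publishes
# only the sum of what it received; each partial sum looks random on its own.
partial_sums = [sum(column) % M for column in zip(*all_shares)]
total = sum(partial_sums) % M
assert total == sum(salaries)   # everyone learns the total, nothing else
```

Any single share, or any single partial sum, is statistically independent of the salary it came from; only when everything is combined does a meaningful answer appear.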
Yao’s “millionaires’ problem” was one of the founding puzzles of a new field of cryptography: “secure multiparty computation”, in which computer scientists study how two or more parties can interact in such a way that each keeps their important information secret, yet all can draw meaningful conclusions from the combined data. The work led to a flowering of increasingly remarkable results in the mid-1980s, one of the most dazzling being the “zero-knowledge proof”, in which it is possible to prove to someone else that you possess confidential information without revealing anything about it. This allows you to prove, for example, that you have solved a sudoku without revealing any details of your solution. Zero-knowledge proofs, like the solution to the millionaires’ problem, involve a process of sending and receiving packets of information in which the important details are mixed with randomness.
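The sudoku example is hard to compress, but the classic Schnorr protocol gives a compact taste of a zero-knowledge proof: the prover convinces a verifier that she knows the discrete logarithm x of a public value y = g^x mod p, while the exchanged messages reveal nothing about x. The parameters below are demo-sized, far too small for real security:

```python
import secrets

# Demo-sized group: p = 2q + 1 with q prime; g generates the subgroup
# of order q. Real deployments use 2048+ bit groups or elliptic curves.
p, q, g = 2039, 1019, 2

x = secrets.randbelow(q)      # the prover's secret
y = pow(g, x, p)              # public value: y = g^x mod p

def prove_and_verify() -> bool:
    """One round of the interactive Schnorr identification protocol."""
    k = secrets.randbelow(q)  # prover's one-time random nonce
    t = pow(g, k, p)          # 1. prover sends commitment t
    c = secrets.randbelow(q)  # 2. verifier sends a random challenge c
    s = (k + c * x) % q       # 3. prover responds; s is masked by the random k
    # Verifier checks g^s == t * y^c (mod p), which holds iff s = k + c*x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

assert all(prove_and_verify() for _ in range(20))
```

The verifier ends up convinced that the prover knows x, yet every message it saw (t, c, s) is uniformly random on its own, which is exactly the "important details mixed with randomness" pattern described above.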
Another powerful tool in the Pets toolbox is “fully homomorphic encryption”, a seemingly magical procedure often described as the holy grail of cryptography. It allows person A to encrypt a set of data and give it to person B, who performs calculations on the encrypted data. These calculations give B a result that is itself encrypted, and that can be decrypted only once it is handed back to A. In other words, person B performs analysis on the database while learning nothing about the data or the results of that analysis. (It works because certain abstract structures, called homomorphisms, are preserved by the encryption process.) When fully homomorphic encryption was first discussed in the 1970s, computer scientists were unsure whether it was even possible, and it was not until 2009 that the American computer scientist Craig Gentry demonstrated how it could be done.
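Fully homomorphic encryption supports both addition and multiplication of encrypted values and is too involved to sketch here, but a simpler ancestor, Paillier encryption (additively homomorphic only), shows the principle: multiplying two ciphertexts yields an encryption of the sum of the hidden plaintexts. This is a toy implementation with tiny demo primes; real keys use primes of a thousand bits or more:

```python
import secrets
from math import gcd

# Demo-sized primes only; never use keys this small in practice.
p, q = 251, 257
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)            # precomputed decryption factor

def encrypt(m: int) -> int:
    """Encrypt m (0 <= m < n) with fresh randomness r."""
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(20), encrypt(22)
# Multiplying ciphertexts adds the hidden plaintexts: Dec(c1 * c2) = 20 + 22.
assert decrypt((c1 * c2) % n2) == 42
```

Whoever holds c1 and c2 can compute an encryption of their sum without ever learning 20, 22 or 42; only the key holder can decrypt the result, which is the division of labour described in the paragraph above.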
These three groundbreaking concepts (secure multiparty computation, zero-knowledge proofs and fully homomorphic encryption) are different ways of sharing data without revealing it. In the early years of research, in the 1980s, cryptographers saw little practical use for these innovations, because there were no obvious real-world problems for them to solve.
Times have changed. The world is awash with data, and data privacy has become a contentious political, ethical and legal issue. Decades after Pets began as essentially academic games, they are now being hailed as the solution to one of the digital world’s defining problems: how to keep sensitive data private while still extracting value from it.
The emergence of applications has driven the theory to the point of commercial viability. Microsoft, for example, uses fully homomorphic encryption when you register a new password: the password is encrypted and sent to a server, which checks whether it appears on a list of passwords known to have been exposed in data breaches, without the server ever being able to determine your password. Meta, Google and Apple have introduced similar tools to some of their products over the past year.
As well as new cryptographic techniques, the term Pets also covers recent advances in computational statistics, such as “differential privacy” (introduced in 2006), in which noise is added to results in order to protect the privacy of individuals. This is useful in applications such as official statistics, where straightforward averages can inadvertently reveal personal information about people in minority groups.
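As a sketch of how the noise is calibrated, the standard Laplace mechanism scales the noise to a query's sensitivity; a counting query changes by at most 1 when one person is added or removed, so noise drawn from Laplace(1/ε) gives ε-differential privacy. The dataset below is invented:

```python
import math
import random

def laplace(scale: float, rng=random) -> float:
    """Draw one sample from a Laplace(0, scale) distribution."""
    u = rng.random() - 0.5   # uniform on [-0.5, 0.5); u = -0.5 is negligibly rare
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float, rng=random) -> float:
    """Count matching records, then add Laplace(1/epsilon) noise.

    A count has sensitivity 1 (one person changes it by at most 1),
    so this release is epsilon-differentially private.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace(1.0 / epsilon, rng)

# Invented example: how many patients are 65 or older?
ages = [34, 29, 41, 67, 71, 23, 55]
noisy = private_count(ages, lambda a: a >= 65, epsilon=0.5)
# The true answer is 2; each query returns 2 plus fresh noise.
```

Individually, each noisy answer is too fuzzy to pin down any one person; averaged over many releases, the statistics remain useful, which is the trade-off differential privacy formalises.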
Much of the recent investment in Pets has come from the cryptocurrency industry. Earlier this year, the cryptocurrency exchange Coinbase spent more than $150m to acquire Unbound Security, a multiparty computation startup co-founded by the Briton Nigel Smart, a professor of cryptography at KU Leuven in Belgium. “Multiparty computation is now everywhere in the blockchain space,” he says. “What was novel only a year or two ago is on its way to becoming the standard.”
He believes Pets will eventually spread throughout the digital ecosystem. “This is the future. It is not a fad. This technology allows you to collaborate with people you may not have thought to collaborate with before, either because it was legally impossible or because it was not in your business interests to disclose the information. That opens up new markets and applications that we are only just beginning to see. It’s like the early days of the internet: no one knew what applications were coming. We are in the same situation with Pets.
“I think it will become more and more mainstream. You will see it everywhere. Eventually, all data will be covered by privacy-enhancing technologies.”
Current Pet applications are niche, partly because the technology is so new and partly because few people are aware of it. Earlier this year, the UK and US governments announced a joint £1.3m prize for companies to come up with ideas to “unleash the potential of Pets to tackle global societal challenges”.
Some uses, however, are already having an effect, such as Callisto. DeTomasi says that 10-15% of survivors who use the site have had matches, meaning their attackers have assaulted more than one victim. DeTomasi does not know the names of the matched survivors or of the attackers, because the system keeps them private. (“Rachel” in the introduction is an invented name, used for illustration.)
DeTomasi says that 90% of sexual assaults on campuses are committed by repeat offenders, who commit an average of six assaults during their time at college. “So if we stop them after two, we prevent 59% of the attacks.” Callisto is currently available at 40 US universities, including Stanford, Yale, Notre Dame and Northwestern, and the plan is to roll it out to all of them. “It’s definitely needed,” she adds, “and it definitely works.”
The Secret Life of Pets
Four of the most important privacy-enhancing technologies
Secure multiparty computation
Allows two or more parties to compute on their combined information without revealing their private inputs to one another.
Zero-knowledge proofs
Allow one person to prove to another that they know something to be true, without revealing any information about how they know it.
Fully homomorphic encryption
Allows analysis of encrypted data without first decrypting it; often described as the holy grail of cryptography.
Differential privacy
A method of adding statistical noise to results so that they can be published without revealing information about any individual.