Cryptographers love a puzzle, a problem to solve – and this one has it all. Unbreakable codes, secret notes, encryption and decryption.
Here’s the riddle: someone wants to send a secure message online. It has to be so private, so secret, that they can deny they ever sent it. If someone leaks the message, it can never be traced back to the sender. It’s all very Mission: Impossible. But there’s a catch: if that message is spreading abuse or misinformation, perhaps threatening violence, then anonymity may need to go out the window – the sender needs to be held accountable.
And that’s the challenge: is there a way to let people send private, secure, untraceable messages, while still making it possible to trace threatening ones back to their senders?
Mayank Varia may have cracked the riddle. A cryptographer and computer scientist, Varia is an expert on the societal impact of algorithms and programs, and develops systems that balance privacy and security with transparency and social justice. Working with a team of computer scientists from Boston University, he has developed a program called Hecate – aptly named after the ancient Greek goddess of magic and spells – that can be bolted onto a secure messaging app to strengthen its confidentiality, while also allowing moderators to crack down on abuse. The team is presenting its findings at the 31st USENIX Security Symposium.
“Our goal in cryptography is to build tools and systems that allow people to get things done securely in the digital world,” says Varia, an associate professor in BU’s Faculty of Computing & Data Sciences. “The question our paper addresses is: what is the most effective way to build an abuse reporting mechanism – the fastest, most efficient way, the one that provides the strongest security guarantees while weakening them as little as possible?”
It’s an approach he’s also taking beyond messaging apps, building online tools that let local governments track gender wage gaps – without accessing private payroll data – and that let victims of sexual assault report their attackers more safely.
Everything is deniable
When two people chat in a private room, what they say is just between them – there is no paper trail, no recording; the conversation lives on in memory alone. Put that same conversation online – Twitter, Facebook, email – and it’s a different story. Every word is preserved for posterity. Sometimes that’s good, but just as often it’s not. An activist in an authoritarian state trying to reach a journalist, or a patient seeking help for a private health problem, may not want their words broadcast to the world or kept in an archive.
This is where end-to-end encryption comes in. Popularized by apps like WhatsApp and Signal, it scrambles sent messages into an unreadable format, only decrypting them when they land on the recipient’s phone. It also ensures that messages sent from one person to another cannot be traced back to the sender; just like that private in-person chat, it is a conversation without traces or records – everything is deniable.
“The goal of these deniable messaging systems is that even if my phone is compromised after we’ve had an encrypted messaging conversation, there are no digital breadcrumbs that would allow an outside person to know for sure what we sent or even who said it,” says Varia.
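Deniability in encrypted messaging typically comes from authenticating messages with shared symmetric keys rather than digital signatures: since both parties hold the same key, either one could have forged any message in the transcript, so the transcript proves nothing to an outsider. A minimal sketch of that idea using an HMAC (all names here are illustrative; real protocols like Signal derive the key through a key exchange and a ratchet):

```python
import hashlib
import hmac
import secrets

# Both parties hold the same symmetric key (in real protocols this is
# derived via a key exchange, e.g. Signal's X3DH + Double Ratchet).
shared_key = secrets.token_bytes(32)

def authenticate(key: bytes, message: bytes) -> bytes:
    """Tag a message with an HMAC. Anyone holding the key can do this."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Constant-time check that the tag matches the message."""
    return hmac.compare_digest(authenticate(key, message), tag)

msg = b"meet at the usual place"
tag = authenticate(shared_key, msg)

# The recipient can confirm the message came from someone holding the key...
assert verify(shared_key, msg, tag)

# ...but the recipient could have produced the very same tag themselves,
# so the transcript cannot prove to a third party WHO wrote the message.
forged_tag = authenticate(shared_key, b"I never said that")
assert verify(shared_key, b"I never said that", forged_tag)
```

Contrast this with a digital signature, which only the sender’s private key can produce and which therefore does bind a message to its author – exactly the "breadcrumb" deniable systems are designed to avoid.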
Amnesty International calls encryption a human right, arguing it is “an essential protection of [everyone’s] rights to privacy and free speech,” and especially important for those fighting corruption or challenging governments. But, like much of the online world, this privacy can be exploited or twisted for more sinister purposes. “There are certain cases where this can be a bad thing,” says Varia. “Suppose the messages someone is sending you are harassing and abusive, and you want to seek help. You want to be able to prove to the moderator what the message was and who sent it.”
A study of elementary, middle, and high school students in Israel – where reportedly more than 97 percent of children use WhatsApp – found that 30 percent had been bullied on the app, while British prosecutors have said end-to-end encryption could hamper their ability to catch and stop child abusers. Extremist groups, from Islamic State to domestic terrorists, have relied on encrypted apps like Telegram and Signal to spread their calls for violence.
The challenge for tech companies is to find a way to reconcile the right to privacy with the need for accountability. Hecate offers a way to do both: it allows an app’s users to deny they ever sent a message, but also lets them be reported if they send something abusive.
A message in invisible ink
Developed by Varia and graduate students Rawane Issa (GRS’22) and Nicolas Alhaddad (GRS’24), Hecate begins with the accountability side of that deniable-yet-accountable combination. Using the program, an app’s moderator creates a unique batch of electronic signatures – or tokens – for each user. When that user sends a message, a hidden token goes along with it. If the recipient decides to report that message, the moderator can examine the sender’s token and take action. The approach is called asymmetric message franking.
The deniability, Varia says – the part that allows a sender to disavow a message – comes from the fact that the token is only useful to the moderator.
“The token is an encrypted statement that only the moderator knows how to read – it’s like they wrote a message to their future self in invisible ink,” says Varia. “The moderator is the one creating these tokens. That’s the nifty part of our system: even if the moderator goes rogue, they can’t show the token to the rest of the world and convince them – they have no digital proof, no breadcrumbs they can show to others.”
The user can maintain deniability – at least publicly.
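The token mechanism can be sketched with a symmetric MAC under a key only the moderator holds: to everyone else the token is an opaque blob, but the moderator can later check which user it points to. This is a toy illustration only – Hecate’s actual construction is asymmetric, and unlike this sketch it also prevents a rogue moderator from proving attributions to outsiders (here, revealing the key would do exactly that). All names are hypothetical:

```python
import hashlib
import hmac
import secrets

# Only the moderator holds this key; to anyone else, the tokens it
# produces are indistinguishable from random bytes.
moderator_key = secrets.token_bytes(32)

def issue_token(sender_id: str) -> bytes:
    """Moderator pre-computes a batch of these for a user at signup.
    One hidden token travels alongside each message the user sends."""
    nonce = secrets.token_bytes(16)  # makes every token unique
    tag = hmac.new(moderator_key, nonce + sender_id.encode(),
                   hashlib.sha256).digest()
    return nonce + tag

def open_token(token: bytes, suspected_sender: str) -> bool:
    """On an abuse report, the moderator 'reads the invisible ink':
    recompute the tag and see whether it matches the suspect."""
    nonce, tag = token[:16], token[16:]
    expected = hmac.new(moderator_key, nonce + suspected_sender.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

token = issue_token("alice")
assert open_token(token, "alice")      # moderator can attribute a report
assert not open_token(token, "bob")    # ...and only to the right sender
```

Anyone without `moderator_key` learns nothing from the token, which is what preserves the sender’s public deniability between reports.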
Similar message franking systems already exist – Facebook parent Meta uses one on WhatsApp – but Varia says Hecate is faster, more secure, and more future-proof than current programs.
“Hecate is the first message franking scheme that simultaneously achieves fast execution on a phone and for the moderator’s server, support for message forwarding, and compatibility with anonymous communication networks such as Signal’s sealed sender,” says Varia. “Prior designs achieved at most two of these three goals.”
The civic impact of algorithms
The team says that with just a few months of custom development and testing, Hecate could be ready for deployment in apps like Signal and WhatsApp. But despite its technical advantages, Varia says companies should approach Hecate with caution until its potential societal impact has been fully evaluated.
“It’s a question of can we build this, but it’s also a question of should we build this?” says Varia. “We can try to design these tools to provide safety benefits, but there may need to be longer dialogues and discussions with affected communities. Are we achieving the right notion of security for, say, the journalist, the dissident, the people being harassed online?”
As head of the CDS Hub for Civic Tech Impact, Varia is used to weighing the societal and policy impact of his research. The hub’s goal is to develop software and algorithms that advance the public interest, whether that means helping combat misinformation or promoting greater government transparency. A theme of recent projects is the creation of programs that, like Hecate, walk the line between privacy and accountability.
In a recent partnership with the Boston Women’s Workforce Council, for example, BU computer scientists built a gender wage gap calculator that allows companies to share salaries with the city without sensitive payroll data ever leaving their servers.
“We’re developing tools that allow people – and this sounds counterintuitive – to compute on data they cannot see,” says Varia, a member of the federal government’s Advisory Committee on Data for Evidence Building. “Maybe I want to send you a message, but I don’t want you to read it; it’s weird, but maybe several of us are sending you information and we want you to be able to do some computation over it.”
That work has piqued the interest of the Defense Advanced Research Projects Agency and the Naval Information Warfare Center, both of which funded the research that led to Hecate, and both of which have an interest in having computing experts process data without ever seeing the secrets hidden inside it.
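One standard way to "compute on data you can't see" is additive secret sharing: each private number is split into random-looking shares held by different servers, which can add their shares locally; only the combined total is ever revealed. This is a minimal sketch of that general technique, not the BWWC calculator’s actual deployment, and all values are illustrative:

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret: int, n_parties: int = 3) -> list[int]:
    """Split a value into n shares that sum to it mod PRIME.
    Any n-1 shares together look like uniform random numbers."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares into the hidden value."""
    return sum(shares) % PRIME

# Each company's private input (hypothetical salaries):
salaries = [82_000, 74_500, 91_200]
shared = [share(s) for s in salaries]

# Each server sums the one share it holds from every company;
# no server ever sees a real salary.
per_server_sums = [sum(column) % PRIME for column in zip(*shared)]

# Only the aggregate is revealed: the total payroll, nothing else.
assert reconstruct(per_server_sums) == sum(salaries)
```

Because addition commutes with the sharing, the servers can jointly compute sums (and from them averages and wage gaps) while individually learning nothing about any single input.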
Varia’s encryption approach could also benefit survivors of sexual abuse. He recently teamed up with the San Francisco-based nonprofit Callisto to develop a new secure sexual assault reporting system. Inspired by the #MeToo movement, their goal is to help victims of assault who are afraid to speak up.
“They report their sexual assault case into our system, and that report sort of disappears into the ether,” says Varia. “But if someone else reports being attacked by the same perpetrator, then – and only then – does the system recognize the existence of that match.”
That information goes to a volunteer attorney – bound by attorney-client privilege – who can then work with the victims and survivors on next steps. Like Hecate, Varia says, it strikes a balance between privacy and openness, between deniability and traceability.
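The matching idea can be sketched with a simple digest index: each report is filed under an opaque hash of the perpetrator identifier, and the system surfaces anything only when a second report produces the same digest. Callisto’s real design uses stronger cryptography (so the server itself cannot brute-force names) – this toy, with hypothetical names throughout, only shows the shape of the match-then-escalate logic:

```python
import hashlib
from collections import defaultdict

# Server-side index: digest of perpetrator identifier -> filed reports.
index: defaultdict[str, list[str]] = defaultdict(list)

def submit_report(perpetrator_id: str, sealed_report: str):
    """File a report. It 'disappears into the ether' under an opaque
    digest -- unless another report names the same perpetrator."""
    normalized = perpetrator_id.strip().lower()
    digest = hashlib.sha256(normalized.encode()).hexdigest()
    index[digest].append(sealed_report)
    if len(index[digest]) >= 2:
        # Only now does the system acknowledge a match exists;
        # the matched reports are escalated to a volunteer attorney.
        return index[digest]
    return None

assert submit_report("j. doe", "report A") is None          # nothing revealed
matched = submit_report("J. Doe", "report B")               # second report: match
assert matched == ["report A", "report B"]
```

The key property mirrored here is that a single report reveals nothing; only the existence of a second, matching report triggers any disclosure at all.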
“When we talk about trade-offs between privacy, digital civil rights, and other rights, there are sometimes natural tensions,” says Varia. “But we can do both: we don’t have to build a system that allows mass surveillance or broad attribution of metadata about who is talking to whom. We can provide strong privacy and human rights while also providing trust and safety online and helping the people who need it.”