Encrypted cloud systems can now perform computations that produce decoy results to trick an attacker while the real answer stays hidden.
This framework allows users to maintain plausible deniability even when coerced into revealing their decryption keys. A user can hand over a set of fake keys that decrypts the data into a believable but bogus result, while the real computation stays hidden from the cloud provider and from any coercive adversary. This is a significant advance for privacy in high-stakes settings such as political activism or defense against corporate espionage. It means that even if you are forced to decrypt, you can protect the truth without appearing to hide anything.
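To make the fake-key idea concrete, here is a minimal sketch of the classic one-time-pad trick behind deniable encryption. This is not the paper's PD-FHC construction (which operates homomorphically on encrypted data); it only illustrates how a single ciphertext can be "opened" to two different plaintexts with two different keys. All names (`encrypt`, `forge_key`, the sample messages) are illustrative.

```python
import os

def encrypt(message: bytes, key: bytes) -> bytes:
    # One-time-pad: ciphertext = message XOR key (also serves as decryption).
    return bytes(m ^ k for m, k in zip(message, key))

def forge_key(ciphertext: bytes, decoy: bytes) -> bytes:
    # Construct a fake key under which the same ciphertext
    # decrypts to the chosen decoy message.
    return bytes(c ^ d for c, d in zip(ciphertext, decoy))

secret = b"launch at dawn"
decoy  = b"grocery list!!"            # decoy of the same length
real_key = os.urandom(len(secret))

ct = encrypt(secret, real_key)        # what the adversary sees
fake_key = forge_key(ct, decoy)       # what the user surrenders

assert encrypt(ct, real_key) == secret   # real key reveals the truth
assert encrypt(ct, fake_key) == decoy    # fake key reveals the decoy
```

Because a one-time-pad ciphertext is consistent with every plaintext of the same length, the surrendered fake key is statistically indistinguishable from a genuine one; PD-FHC aims for an analogous guarantee while the cloud computes on the ciphertext.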
Plausible Deniability in Fully Homomorphic Computation
arXiv · 2605.01985
We introduce \emph{Plausible Deniability in Fully Homomorphic Computation} (PD-FHC), a framework enabling users to outsource Boolean computations to an untrusted cloud while maintaining both computational privacy against honest-but-curious providers and plausible deniability against coercive adversaries. We define the notion of a \emph{Deniable Computation Medium} (DCM) and a \emph{Deniable Computation Scheme} (DCS) as medium-independent abstractions, then instantiate them using RGB images with