‘Must Fix Trust’: Privacy-enhancing technologies as reductive tool
Abstract
Privacy-enhancing and other related digital technologies are marketed as increasing trust within the digital society. They are deployed as means to assert trust in digital transactions and interactions between actors. What this commentary argues is that digital trust is thereby reduced to the product of a technological iteration, insertion, or fix: more encryption ≃ more privacy ≃ more trust. However righteous it may be to foster privacy, the promise to uphold something as uncertain as trust by the use of mathematics and/or statistics alone is short-sighted. Lofty notions like ‘data minimisation’ and ‘privacy-by-design’ rest on deterministic assumptions. We suggest that, by reductively appropriating the concept of trust and failing to meet the expectations it raises, the technification of trust – i.e., the making of trust into a product of technê (alone) – has paradoxical consequences: it actually undermines trust by centralising power within tech companies.