Efficient and Privacy-Preserving Argmax Approximation Using Homomorphic Encryption for Neural Networks

Abstract

Privacy-preserving neural networks are a compelling approach for enabling secure training and inference without compromising user data confidentiality. Fully Homomorphic Encryption (FHE) is a cornerstone technology in this domain, as it permits computation over encrypted data. However, FHE natively supports only addition and multiplication, making non-linear functions—such as activations, Argmax, and max-pooling—particularly challenging to evaluate on ciphertexts. This work addresses the complexity of executing the Argmax operation homomorphically, which is essential for identifying the index of the maximum element in a dataset. Building upon existing methods that approximate non-linear functions such as sign and Argmax with compositions of low-degree minimax polynomials, we introduce a refined homomorphic Argmax approximation algorithm. Our approach improves both accuracy and efficiency through a multi-phase design comprising rotation accumulation, tree-based comparisons, normalization, and final output selection. We integrate the proposed approximation algorithm into a neural network architecture and conduct comparative evaluations. The results demonstrate that our method not only yields a modest improvement in prediction accuracy but also reduces inference latency by 58%, primarily due to the optimization of homomorphic sign and rotation operations.
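To make the abstract's construction concrete, the following is a minimal plaintext sketch (not the authors' implementation, and with illustrative parameters) of the two FHE-friendly building blocks it describes: a sign approximation built purely from additions and multiplications via a composed low-degree polynomial, and a tree-based Argmax that combines pairwise comparisons with a soft one-hot selection mask. The iteration count and the specific polynomial f(t) = 1.5t − 0.5t³ are assumptions chosen for clarity, not the paper's minimax coefficients.

```python
import numpy as np

def approx_sign(x, iters=10):
    # Composition of the low-degree polynomial f(t) = 1.5*t - 0.5*t**3,
    # which converges toward sign(t) on [-1, 1] using only additions and
    # multiplications -- the operations FHE supports natively.
    for _ in range(iters):
        x = 1.5 * x - 0.5 * x ** 3
    return x

def approx_compare(a, b):
    # Soft comparison: ~1 if a > b, ~0 if a < b.
    # Inputs are assumed pre-normalized so that a - b lies in [-1, 1].
    return (approx_sign(a - b) + 1.0) / 2.0

def approx_argmax(values):
    # Tree-based reduction: at each level, compare adjacent pairs and
    # blend both the running maximum and a soft one-hot index mask.
    # In the encrypted setting, pairing slots would be realized with
    # homomorphic rotations; here plain Python lists stand in for that.
    v = list(values)
    masks = [np.eye(len(v))[i] for i in range(len(v))]
    while len(v) > 1:
        next_v, next_m = [], []
        for i in range(0, len(v) - 1, 2):
            c = approx_compare(v[i], v[i + 1])          # ~1 if left wins
            next_v.append(c * v[i] + (1 - c) * v[i + 1])
            next_m.append(c * masks[i] + (1 - c) * masks[i + 1])
        if len(v) % 2:                                   # odd element carries over
            next_v.append(v[-1])
            next_m.append(masks[-1])
        v, masks = next_v, next_m
    return masks[0]                                      # soft one-hot indicator
```

For example, `approx_argmax([0.1, 0.9, 0.3, 0.2])` yields a vector close to `[0, 1, 0, 0]`; applying it as a mask (or decrypting and taking the largest entry) recovers the index of the maximum. The tree structure uses O(log n) comparison rounds, which is what makes the rotation and sign-evaluation optimizations mentioned in the abstract pay off.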