HyperVector and SuperHyperVector Spaces with Applications in Machine Learning: Feature, Support, and Relevance Vectors

Abstract

This paper introduces the concept of a SuperHyperVector Space, which extends classical vector spaces via the SuperHyperstructure framework built on the nth iterated powerset. We first review how Hyperstructures and SuperHyperstructures arise by applying the powerset and iterated powerset operations to a base set. We then recall that a vector space consists of a set equipped with addition and scalar multiplication satisfying linearity axioms, and that a hypervector space generalizes this structure by using a scalar hyperoperation that assigns to each scalar–vector pair a nonempty subset of vectors while preserving distributivity and associativity. Building on these ideas, we define SuperHyperVector Spaces by introducing a SuperHyperOperation on the iterated powerset of the underlying group, and we briefly examine their fundamental properties and hierarchical modeling potential. Furthermore, in the context of Machine Learning, we investigate extensions of the HyperVector concept, including Feature Vectors, Support Vectors, and Relevance Vectors, through the use of HyperVector and SuperHyperVector representations.
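The two building blocks named in the abstract, the iterated powerset and a scalar hyperoperation returning a nonempty subset, can be sketched concretely for finite sets. The code below is a minimal illustration, not the paper's formal construction; the function names and the particular hyperoperation a ∘ v = {a·v mod m, 0} are assumptions chosen for demonstration.

```python
from itertools import chain, combinations

def powerset(s):
    """Return the powerset of a finite iterable as a set of frozensets."""
    items = list(s)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

def iterated_powerset(s, n):
    """Apply the powerset operation n times to the base set s (P^n(s))."""
    result = frozenset(s)
    for _ in range(n):
        result = frozenset(powerset(result))
    return result

def scalar_hyperop(a, v, m=5):
    """Illustrative scalar hyperoperation on Z_m: maps the pair (a, v)
    to a nonempty SUBSET of vectors rather than a single vector."""
    return frozenset({(a * v) % m, 0})

# A base set of 2 elements has 2^2 = 4 subsets; its second iterated
# powerset therefore has 2^4 = 16 elements.
print(len(iterated_powerset({0, 1}, 1)))   # 4
print(len(iterated_powerset({0, 1}, 2)))   # 16
print(sorted(scalar_hyperop(3, 4)))        # [0, 2]
```

The hierarchical growth (2 → 4 → 16 → …) is what gives SuperHyperstructures their layered modeling capacity: each application of the powerset adds one level of grouping over the level below.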
