Distributed vector representations are a key bridge between connectionist and symbolic representations in cognition, but it is unclear how uncertainty should be modelled in systems that use such representations. In this paper we discuss how bundles of symbols in certain Vector Symbolic Architectures (VSAs) can be understood as defining an object closely related to a probability distribution, and how statements in VSAs can be understood as analogues of probabilistic statements. The aim of this paper is to show how (spiking) neural implementations of VSAs can be used to implement probabilistic operations that are useful in building cognitive models. We show how the similarity operator between continuous values represented as Spatial Semantic Pointers (SSPs), an example of a technique known as fractional binding, induces a quasi-kernel function that can be used in density estimation. Further, we sketch novel designs for networks that compute the entropy and mutual information of VSA-represented distributions, and demonstrate their performance when implemented as networks of spiking neurons. We also discuss the relationship between our technique and quantum probability, another approach proposed for modelling uncertainty in cognition. While we restrict ourselves to operators proposed for Holographic Reduced Representations and to representing real-valued data, we suggest that the methods presented in this paper should translate to any VSA in which the dot product between fractionally bound symbols induces a valid kernel.
Keywords: Bayesian modelling; Fractional binding; Probability; Spatial semantic pointers; Vector symbolic architecture.
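To make the core idea concrete, here is a minimal NumPy sketch, assuming a standard fractional power encoding of real values: a scalar x is encoded as an SSP by raising a random unitary base vector to the power x in the Fourier domain, the dot product between two such SSPs approximates a sinc-shaped quasi-kernel, and averaging (bundling) the SSPs of observed samples yields a vector that can be probed for quasi-kernel density estimates. The base-vector construction, the length-scale parameter, and all names below are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of fractional binding (fractional power encoding) and SSP-based
# quasi-kernel density estimation; assumptions only, not the paper's code.
import numpy as np

rng = np.random.default_rng(seed=0)
D = 1024  # SSP dimensionality


def make_base_fft(dim, rng):
    """Fourier coefficients of a random unitary base vector: unit magnitude,
    conjugate-symmetric so that the time-domain vector is real."""
    phases = rng.uniform(-np.pi, np.pi, dim // 2 - 1)
    # DC and Nyquist phases fixed at 0; remaining phases mirrored with a sign flip.
    all_phases = np.concatenate(([0.0], phases, [0.0], -phases[::-1]))
    return np.exp(1j * all_phases)


BASE_FFT = make_base_fft(D, rng)


def encode(x, length_scale=1.0):
    """Fractional binding: raise the base vector to the real-valued power x."""
    return np.fft.ifft(BASE_FFT ** (x / length_scale)).real


# The similarity between two SSPs depends (approximately) only on x - y and
# converges to a normalised sinc kernel as the dimensionality grows.
print(encode(0.3) @ encode(0.3))  # ~1.0
print(encode(0.3) @ encode(2.5))  # ~sinc(2.2), close to 0

# Density estimation: bundle (average) the SSPs of observed samples, then probe
# the bundle with the SSP of each query point.  The dot product gives an
# unnormalised, possibly negative, quasi-density estimate (hence "quasi-kernel").
samples = rng.normal(loc=0.0, scale=1.0, size=500)
memory = np.mean([encode(s) for s in samples], axis=0)

queries = np.linspace(-4.0, 4.0, 81)
quasi_density = np.array([memory @ encode(q) for q in queries])
```

Because the sinc kernel integrates to one but can take negative values, the probed values behave like a kernel density estimate that is only a quasi-probability, consistent with the abstract's description of a quasi-kernel rather than a kernel proper.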
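The abstract also mentions networks that compute the entropy and mutual information of VSA-represented distributions. The following sketch shows one plausible (non-spiking) readout of those quantities from bundled SSPs, using a resubstitution-style plug-in estimator; the rectification of negative quasi-probabilities, the choice of estimator, and the use of circular convolution to form the joint encoding are assumptions for illustration, not necessarily the network designs evaluated in the paper.

```python
# Plausible plug-in estimates of entropy and mutual information from bundled
# SSPs; an illustrative assumption, not the paper's spiking-network design.
import numpy as np

rng = np.random.default_rng(seed=1)
D = 1024


def make_base_fft(dim, rng):
    """Random unitary base vector (unit-magnitude, conjugate-symmetric spectrum)."""
    phases = rng.uniform(-np.pi, np.pi, dim // 2 - 1)
    return np.exp(1j * np.concatenate(([0.0], phases, [0.0], -phases[::-1])))


BASE_X = make_base_fft(D, rng)  # base vector for the X domain
BASE_Y = make_base_fft(D, rng)  # independent base vector for the Y domain


def encode(base_fft, value):
    """Fractional binding of a scalar onto the given base vector."""
    return np.fft.ifft(base_fft ** value).real


def bind(a, b):
    """Circular convolution, the binding operator of Holographic Reduced Representations."""
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real


# Correlated samples (x, y); the joint value is encoded by binding the two SSPs.
x = rng.normal(size=1000)
y = 0.8 * x + 0.6 * rng.normal(size=1000)

phi_x = [encode(BASE_X, xi) for xi in x]
phi_y = [encode(BASE_Y, yi) for yi in y]
phi_xy = [bind(px, py) for px, py in zip(phi_x, phi_y)]

# Bundled memories standing in for p(x), p(y), and p(x, y).
mem_x = np.mean(phi_x, axis=0)
mem_y = np.mean(phi_y, axis=0)
mem_xy = np.mean(phi_xy, axis=0)


def entropy(memory, probes, eps=1e-3):
    """Resubstitution estimate H ~ -mean(log p_hat), clipping negative quasi-probabilities."""
    p_hat = np.array([memory @ q for q in probes])
    return -np.mean(np.log(np.maximum(p_hat, eps)))


h_x, h_y, h_xy = entropy(mem_x, phi_x), entropy(mem_y, phi_y), entropy(mem_xy, phi_xy)
print("H(X)   ~", h_x)
print("H(Y)   ~", h_y)
print("I(X;Y) ~", h_x + h_y - h_xy)  # plug-in estimate via I = H(X) + H(Y) - H(X, Y)
```

A spiking implementation would realise the same dot products and nonlinearities with populations of spiking neurons, as the abstract describes; the sketch above only illustrates the arithmetic such networks would approximate.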