Authors
Peter Sutor, Dehao Yuan, Douglas Summers-Stay, Cornelia Fermuller, Yiannis Aloimonos
Publication date
2022/7/18
Conference
2022 International Joint Conference on Neural Networks (IJCNN)
Pages
1-10
Publisher
IEEE
Description
Hyperdimensional Computing affords simple yet powerful operations to create long Hyperdimensional Vectors (hypervectors) that can efficiently encode information, be used for learning, and are dynamic enough to be modified on the fly. In this paper, we explore the notion of using binary hypervectors to directly encode the final, classifying output signals of neural networks in order to fuse differing networks together at the symbolic level. This allows multiple neural networks to work together to solve a problem, with little additional overhead. Output signals just before classification are encoded as hypervectors and bundled together through consensus summation to train a classification hypervector. This process can be performed iteratively, and even on single neural networks, by instead taking a consensus of multiple classification hypervectors. We find that this outperforms the state of the art, or is on a par with it …
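The consensus summation (bundling) operation described above can be sketched in a few lines: binary hypervectors are combined by a per-component majority vote, and classification compares a query against the bundled class prototype by Hamming similarity. This is a minimal, self-contained illustration of the general HDC bundling primitive, not the authors' implementation; the dimensionality, tie-breaking rule, and function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality; high dimension is what makes HDC work

def random_hypervector(d=D):
    """A random binary hypervector with i.i.d. 0/1 components."""
    return rng.integers(0, 2, size=d, dtype=np.int8)

def bundle(hypervectors):
    """Consensus summation: each output bit is the majority vote of the
    corresponding input bits; exact ties are broken randomly (one common
    convention when bundling an even number of hypervectors)."""
    stacked = np.stack(hypervectors)
    n = stacked.shape[0]
    sums = stacked.sum(axis=0)
    out = (2 * sums > n).astype(np.int8)
    ties = (2 * sums == n)
    out[ties] = rng.integers(0, 2, size=int(ties.sum()), dtype=np.int8)
    return out

def hamming_similarity(a, b):
    """Fraction of matching bits; roughly 0.5 for unrelated hypervectors."""
    return float(np.mean(a == b))

# Bundle several (here: random stand-ins for encoded network outputs)
# into a class prototype, then compare by Hamming similarity.
examples = [random_hypervector() for _ in range(5)]
prototype = bundle(examples)

print(hamming_similarity(prototype, examples[0]))       # noticeably above 0.5
print(hamming_similarity(prototype, random_hypervector()))  # near 0.5
```

In the paper's setting, the `examples` would be binary encodings of a network's pre-classification output signals for one class; classifying a new input amounts to picking the class prototype with the highest Hamming similarity.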