Authors
Gabriel Recchia, Michael Jones, Magnus Sahlgren, Pentti Kanerva
Publication date
2010
Journal
Proceedings of the Annual Meeting of the Cognitive Science Society
Volume
32
Issue
32
Description
Encoding information about the order in which words typically appear has been shown to improve the performance of high-dimensional semantic space models. This requires an encoding operation capable of binding together vectors in an order-sensitive way, and efficient enough to scale to large text corpora. Although both circular convolution and random permutations have been enlisted for this purpose in semantic models, these operations have never been systematically compared. In Experiment 1 we compare their storage capacity and probability of correct retrieval; in Experiments 2 and 3 we compare their performance on semantic tasks when integrated into existing models. We conclude that random permutations are a scalable alternative to circular convolution with several desirable properties.
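Both operations are easy to state concretely. Below is a minimal NumPy sketch of order-sensitive binding by circular convolution and by random permutation; the dimensionality, vector distribution, and encoding scheme are illustrative assumptions, not the exact models evaluated in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 1024  # dimensionality: a hypothetical choice for illustration

    # Random high-dimensional vectors standing in for two words' representations.
    a = rng.standard_normal(d) / np.sqrt(d)
    b = rng.standard_normal(d) / np.sqrt(d)

    # Circular convolution (the binding operation of holographic reduced
    # representations): yields a vector of the same dimensionality and can
    # be computed in O(d log d) via the FFT.
    bound_conv = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    # Random permutation binding: a fixed random permutation of b's elements
    # marks b as occupying a particular position relative to a (for example,
    # its right neighbor) before the vectors are summed; O(d) per binding.
    perm = rng.permutation(d)
    bound_perm = a + b[perm]

The permutation sketch highlights the scalability point the abstract makes: each binding is a single reindexing and addition, whereas convolution requires transform-domain multiplication.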
Total citations
[Citations per year, 2010–2023]