Abstract
We analyze the estimation of the angle, scalar product, and Euclidean distance between real-valued vectors using binary vectors of controlled sparsity. The transformation is performed by projection with a random binary matrix whose elements are {0,1}, followed by a threshold transformation of the output. We also provide a comparative analysis of the error incurred when the similarity measures of the input vectors are estimated by similarity measures of the output binary vectors based on their scalar product.
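The scheme described in the abstract can be illustrated with a minimal sketch: project real vectors through a shared random {0,1} matrix, threshold each output so that a fixed fraction of bits is 1, and compare the resulting sparse binary codes by their scalar product (overlap). All parameter values below (`n_out`, `p_one`, `sparsity`, the seed) are illustrative assumptions, not the paper's settings; the paper's specific estimators for angle and Euclidean distance are not reproduced here.

```python
import numpy as np

def make_projection(n_in, n_out, p_one, rng):
    """Random binary {0,1} projection matrix with element density p_one."""
    return (rng.random((n_out, n_in)) < p_one).astype(np.float64)

def binarize(R, x, sparsity):
    """Project x and threshold so that roughly `sparsity` of output bits are 1."""
    y = R @ x
    t = np.quantile(y, 1.0 - sparsity)  # per-vector threshold controls output sparsity
    return (y > t).astype(np.uint8)

rng = np.random.default_rng(42)
n_in, n_out = 100, 5000
R = make_projection(n_in, n_out, p_one=0.1, rng=rng)

x = rng.standard_normal(n_in)
z = 0.9 * x + 0.1 * rng.standard_normal(n_in)   # vector highly similar to x
w = rng.standard_normal(n_in)                   # unrelated vector

bx = binarize(R, x, sparsity=0.05)
bz = binarize(R, z, sparsity=0.05)
bw = binarize(R, w, sparsity=0.05)

# Overlap (scalar product) of the binary codes reflects input similarity:
overlap_similar = int(bx @ bz)
overlap_random = int(bx @ bw)
```

On average, the more similar the input vectors, the larger the overlap of their binary codes, which is what allows the input similarity measures to be estimated from the binary scalar product.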
Additional information
Translated from Kibernetika i Sistemnyi Analiz, No. 5, September–October, 2015, pp. 157–168.
Cite this article
Rachkovskij, D.A. Estimation of Vectors Similarity by Their Randomized Binary Projections. Cybern Syst Anal 51, 808–818 (2015). https://doi.org/10.1007/s10559-015-9774-1