Implementation Aspects of Tensor Product Variable Binding in Connectionist Systems

  • Conference paper
  • First Online:
Intelligent Systems and Applications (IntelliSys 2019)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1037)

Abstract

Tensor Product Variable Binding is an important element of the bridge between the connectionist approach and the symbolic paradigm. It can represent recursive structures in tensor form, a form naturally suited to neural networks, which are highly distributed in nature and therefore promise computational benefits. However, the practical aspects of implementing tensor binding in modern neural frameworks are not covered in published research. In this work, we attempt to build a topology that performs the binding operation in the well-known Keras framework. We also analyze the proposed solution in terms of its applicability to other important connectionist aspects of Tensor Product Variable Binding. The proposed design of the binding network is a first step towards expressing arbitrary symbolic structures and operations in neural form, which would make it possible to replace traditional decision-making algorithms with neural networks, bringing scalability, robustness, and guaranteed performance.
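
In Smolensky's formulation, binding is the outer product of a filler (symbol) vector and a role (position) vector, and a whole structure is the superposition of its bindings. As a rough illustration of how this operation can map onto a Keras topology, the sketch below builds a two-input model whose output is the batched outer product. The toy dimensions, the one-hot fillers and roles, and the tf.einsum-based Lambda layer are illustrative assumptions, not the exact design proposed in the paper (the author's implementation is linked in the Notes below).

```python
# Minimal sketch of tensor product variable binding in Keras.
# Assumptions: tf.keras API, one-hot fillers/roles, toy dimensions.
import numpy as np
import tensorflow as tf
from tensorflow import keras

FILLER_DIM = 3  # size of each filler (symbol) vector
ROLE_DIM = 2    # size of each role (position) vector

filler = keras.Input(shape=(FILLER_DIM,), name="filler")
role = keras.Input(shape=(ROLE_DIM,), name="role")

# Binding is the outer product f (x) r, computed per batch element;
# the output tensor has shape (batch, FILLER_DIM, ROLE_DIM).
bound = keras.layers.Lambda(
    lambda t: tf.einsum("bi,bj->bij", t[0], t[1]), name="bind"
)([filler, role])

bind_net = keras.Model(inputs=[filler, role], outputs=bound)

# Represent the sequence (A, B): bind filler A to role r0 and filler B
# to role r1, then superpose (sum) the two binding tensors.
f_A, f_B = np.eye(FILLER_DIM)[[0]], np.eye(FILLER_DIM)[[1]]
r_0, r_1 = np.eye(ROLE_DIM)[[0]], np.eye(ROLE_DIM)[[1]]
structure = bind_net.predict([f_A, r_0]) + bind_net.predict([f_B, r_1])

# Unbinding: contract the structure tensor with the role of interest.
recovered = np.einsum("bij,bj->bi", structure, r_0)
print(recovered)  # -> [[1. 0. 0.]], i.e. filler A
```

The final contraction recovers a filler exactly only because the role vectors here are orthonormal; with non-orthogonal roles, unbinding would instead contract with the dual role vectors.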


Notes

  1. https://github.com/demid5111/ldss-tensor-structures

Author information

Corresponding author

Correspondence to Alexander Demidovskij.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Demidovskij, A. (2020). Implementation Aspects of Tensor Product Variable Binding in Connectionist Systems. In: Bi, Y., Bhatia, R., Kapoor, S. (eds) Intelligent Systems and Applications. IntelliSys 2019. Advances in Intelligent Systems and Computing, vol 1037. Springer, Cham. https://doi.org/10.1007/978-3-030-29516-5_9
